Apr 24 19:06:46.566916 ip-10-0-129-124 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 19:06:46.971098 ip-10-0-129-124 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:46.971098 ip-10-0-129-124 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 19:06:46.971098 ip-10-0-129-124 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:46.971098 ip-10-0-129-124 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 19:06:46.971098 ip-10-0-129-124 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 19:06:46.973251 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.973167    2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 19:06:46.977036 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977013    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:46.977036 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977032    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:46.977036 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977037    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:46.977036 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977041    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977045    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977048    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977051    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977054    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977057    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977060    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977062    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977065    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977068    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977071    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977074    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977077    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977079    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977082    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977085    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977088    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977091    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977093    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:46.977221 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977096    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977100    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977108    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977111    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977114    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977116    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977119    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977122    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977124    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977128    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977130    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977134    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977138    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977142    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977145    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977148    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977151    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977154    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977156    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:46.977720 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977159    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977162    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977164    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977167    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977170    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977172    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977175    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977178    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977181    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977183    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977186    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977189    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977191    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977194    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977197    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977199    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977202    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977204    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977207    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977210    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:46.978180 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977212    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977215    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977218    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977220    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977223    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977226    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977229    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977232    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977237    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977239    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977243    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977246    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977250    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977255    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977258    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977261    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977264    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977271    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977274    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977277    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:46.978706 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977280    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977283    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977286    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977288    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977291    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977702    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977708    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977711    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977714    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977717    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977720    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977723    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977726    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977728    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977731    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977735    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977738    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977740    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977743    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977746    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:46.979190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977749    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977751    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977754    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977757    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977759    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977762    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977765    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977767    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977770    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977773    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977775    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977778    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977781    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977783    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977786    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977788    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977791    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977793    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977796    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977799    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:46.979739 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977803    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977806    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977809    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977811    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977814    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977817    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977819    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977822    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977824    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977827    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977830    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977832    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977835    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977838    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977840    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977843    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977845    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977848    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977850    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:46.980354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977853    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977856    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977859    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977862    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977864    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977867    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977869    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977872    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977875    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977877    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977880    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977883    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977886    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977888    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977891    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977894    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977897    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977901    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977904    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977907    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:46.980835 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977910    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977913    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977915    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977917    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977920    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977924    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977926    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977929    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977931    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977934    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977937    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.977939    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978012    2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978020    2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978026    2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978030    2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978036    2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978040    2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978045    2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978050    2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978054    2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 19:06:46.981330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978057    2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978060    2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978064    2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978067    2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978070    2564 flags.go:64] FLAG: --cgroup-root=""
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978073    2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978076    2564 flags.go:64] FLAG: --client-ca-file=""
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978082    2564 flags.go:64] FLAG: --cloud-config=""
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978085    2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978088    2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978092    2564 flags.go:64] FLAG: --cluster-domain=""
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978095    2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978098    2564 flags.go:64] FLAG: --config-dir=""
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978101    2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978105    2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978109    2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978112    2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978115    2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978119    2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978122    2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978125    2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978128    2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978131    2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978134    2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978138    2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 19:06:46.981883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978142    2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978145    2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978147    2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978151    2564 flags.go:64] FLAG: --enable-server="true"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978154    2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978161    2564 flags.go:64] FLAG: --event-burst="100"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978165    2564 flags.go:64] FLAG: --event-qps="50"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978168    2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978171    2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978174    2564 flags.go:64] FLAG: --eviction-hard=""
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978178    2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978181    2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978184    2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978188    2564 flags.go:64] FLAG: --eviction-soft=""
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978192    2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978195    2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978198    2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978201    2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978204    2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978207    2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978210    2564 flags.go:64] FLAG: --feature-gates=""
Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978214    2564 flags.go:64] FLAG:
--file-check-frequency="20s" Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978218 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978221 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978224 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978227 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 24 19:06:46.982492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978231 2564 flags.go:64] FLAG: --help="false" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978234 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-129-124.ec2.internal" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978237 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978240 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978242 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978246 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978250 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978253 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978256 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 19:06:46.983138 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:06:46.978259 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978262 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978267 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978270 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978273 2564 flags.go:64] FLAG: --kube-reserved="" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978276 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978279 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978282 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978285 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978288 2564 flags.go:64] FLAG: --lock-file="" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978291 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978296 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978299 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978305 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 19:06:46.983138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978308 2564 flags.go:64] FLAG: 
--log-text-info-buffer-size="0" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978311 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978314 2564 flags.go:64] FLAG: --logging-format="text" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978317 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978320 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978323 2564 flags.go:64] FLAG: --manifest-url="" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978326 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978330 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978333 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978338 2564 flags.go:64] FLAG: --max-pods="110" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978341 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978344 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978347 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978350 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978353 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 19:06:46.983715 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978356 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978359 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978366 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978369 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978372 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978378 2564 flags.go:64] FLAG: --pod-cidr="" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978381 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978387 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978390 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 19:06:46.983715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978393 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978396 2564 flags.go:64] FLAG: --port="10250" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978400 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978402 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09239ffe4fa0a0986" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978406 2564 flags.go:64] FLAG: --qos-reserved="" Apr 
24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978410 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978413 2564 flags.go:64] FLAG: --register-node="true" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978416 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978419 2564 flags.go:64] FLAG: --register-with-taints="" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978423 2564 flags.go:64] FLAG: --registry-burst="10" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978426 2564 flags.go:64] FLAG: --registry-qps="5" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978429 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978432 2564 flags.go:64] FLAG: --reserved-memory="" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978436 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978439 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978442 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978445 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978448 2564 flags.go:64] FLAG: --runonce="false" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978451 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978454 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" 
Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978457 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978460 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978463 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978467 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978470 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978473 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 19:06:46.984320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978476 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978478 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978481 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978486 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978490 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978493 2564 flags.go:64] FLAG: --system-cgroups="" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978496 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978501 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 19:06:46.984978 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:06:46.978504 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978506 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978511 2564 flags.go:64] FLAG: --tls-min-version="" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978515 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978518 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978521 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978524 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978527 2564 flags.go:64] FLAG: --v="2" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978532 2564 flags.go:64] FLAG: --version="false" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978536 2564 flags.go:64] FLAG: --vmodule="" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978540 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978543 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978649 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978653 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978656 2564 feature_gate.go:328] 
unrecognized feature gate: MetricsCollectionProfiles Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978660 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 19:06:46.984978 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978663 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978665 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978668 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978671 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978674 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978678 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978681 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978685 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978688 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978691 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978693 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978698 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978701 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978704 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978707 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978710 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978713 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978717 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978721 2564 feature_gate.go:349] 
Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 19:06:46.985594 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978729 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978733 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978736 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978738 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978742 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978744 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978747 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978750 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978753 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978756 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978758 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978761 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 
19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978764 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978766 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978769 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978772 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978775 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978777 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978780 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978782 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 19:06:46.986085 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978785 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978788 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978791 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978793 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978797 2564 feature_gate.go:328] 
unrecognized feature gate: GatewayAPI Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978799 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978802 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978805 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978808 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978810 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978813 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978815 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978819 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978822 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978825 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978828 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978830 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 19:06:46.986589 ip-10-0-129-124 
kubenswrapper[2564]: W0424 19:06:46.978833 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978836 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978838 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 19:06:46.986589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978841 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978844 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978846 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978849 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978851 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978854 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978857 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978859 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978862 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978864 2564 feature_gate.go:328] unrecognized 
feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978867 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978870 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978872 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978875 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978877 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978880 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978883 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978886 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978888 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978891 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:46.987083 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978894 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978897 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.978899 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.978905 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.985209 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.985231 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985279 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985284 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985287 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985291 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985294 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985297 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985300 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985304 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985309 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:46.987589 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985312 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985316 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985319 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985322 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985325 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985328 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985330 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985333 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985336 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985339 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985342 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985345 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985347 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985350 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985353 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985355 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985358 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985360 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985363 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:46.987967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985365 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985368 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985370 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985375 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985377 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985380 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985383 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985385 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985388 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985390 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985393 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985396 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985398 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985401 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985403 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985406 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985408 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985411 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985413 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985416 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:46.988431 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985418 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985421 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985424 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985426 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985428 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985432 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985435 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985437 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985439 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985442 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985445 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985447 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985450 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985453 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985455 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985459 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985463 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985465 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985468 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985470 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:46.989190 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985473 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985475 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985478 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985480 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985483 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985485 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985487 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985490 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985493 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985495 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985498 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985500 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985503 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985505 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985508 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985512 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985515 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:46.989832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985519 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.985524 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985638 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985643 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985646 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985649 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985652 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985655 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985658 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985661 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985664 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985666 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985669 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985672 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985675 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 19:06:46.990275 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985679 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985682 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985685 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985688 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985691 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985694 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985697 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985700 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985703 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985705 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985708 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985711 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985713 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985715 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985718 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985722 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985724 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985727 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985730 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 19:06:46.990731 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985732 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985735 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985737 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985740 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985743 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985745 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985748 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985750 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985753 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985756 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985758 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985761 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985764 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985766 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985770 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985774 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985777 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985779 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985782 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985785 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 19:06:46.991250 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985787 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985790 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985792 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985795 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985797 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985800 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985803 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985805 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985808 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985811 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985814 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985816 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985819 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985821 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985824 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985827 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985829 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985832 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985835 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985837 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 19:06:46.991794 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985840 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985843 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985845 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985847 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985850 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985853 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985855 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985858 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985861 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985863 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985866 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985868 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985871 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:46.985874 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.985879 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 19:06:46.992316 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.986576 2564 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 19:06:46.992820 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.988542 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 19:06:46.992820 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.989345 2564 server.go:1019] "Starting client certificate rotation"
Apr 24 19:06:46.992820 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.989459 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:46.992820 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:46.990653 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 19:06:47.011738 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.011706 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:47.015731 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.015709 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 19:06:47.032732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.032697 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 24 19:06:47.038356 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.038336 2564 log.go:25] "Validated CRI v1 image API"
Apr 24 19:06:47.039653 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.039636 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 19:06:47.040050 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.040031 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 19:06:47.045384 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.045356 2564 fs.go:135] Filesystem UUIDs: map[004ab1aa-9f18-4696-a99f-0632093abd83:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 ab379db2-029c-4fbd-b6cb-6b1384a6190d:/dev/nvme0n1p3]
Apr 24 19:06:47.045461 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.045383 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 19:06:47.051071 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.050957 2564 manager.go:217] Machine: {Timestamp:2026-04-24 19:06:47.049023651 +0000 UTC m=+0.373267805 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100276 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2426f94c5beb871f0ee87551640895 SystemUUID:ec2426f9-4c5b-eb87-1f0e-e87551640895 BootID:a2e04608-da9c-4701-adab-2d42b4f4d7d6 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6d:55:b4:e1:29 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6d:55:b4:e1:29 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2e:0f:c9:3a:b1:de Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 19:06:47.051071 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.051066 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 19:06:47.051184 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.051155 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 19:06:47.052202 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.052175 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 19:06:47.052352 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.052205 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-124.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 24 19:06:47.052397 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.052363 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 24 19:06:47.052397 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.052372 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 24 19:06:47.052397 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.052385
2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:06:47.053210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.053198 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 19:06:47.053874 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.053864 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:47.054158 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.054149 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 19:06:47.056451 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.056441 2564 kubelet.go:491] "Attempting to sync node with API server" Apr 24 19:06:47.056515 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.056460 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 19:06:47.056515 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.056473 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 19:06:47.056515 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.056486 2564 kubelet.go:397] "Adding apiserver pod source" Apr 24 19:06:47.056515 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.056499 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 24 19:06:47.057458 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.057446 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:06:47.057502 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.057463 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 19:06:47.060385 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.060361 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 19:06:47.061606 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:06:47.061592 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 19:06:47.063132 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063121 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 19:06:47.063175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063138 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 19:06:47.063175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063144 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 19:06:47.063175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063150 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 19:06:47.063175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063156 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 19:06:47.063175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063162 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 19:06:47.063175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063168 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 19:06:47.063175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063173 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 19:06:47.063404 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063181 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 19:06:47.063404 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063188 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 19:06:47.063404 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063196 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 
19:06:47.063404 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.063205 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 19:06:47.064063 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.064050 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 19:06:47.064063 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.064063 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 19:06:47.067815 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.067792 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 19:06:47.067908 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.067844 2564 server.go:1295] "Started kubelet" Apr 24 19:06:47.068451 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.068422 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 19:06:47.068693 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.068643 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 19:06:47.068913 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.068887 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 19:06:47.069275 ip-10-0-129-124 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 19:06:47.069850 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.069775 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-124.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 19:06:47.069850 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.069837 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 19:06:47.070086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.069958 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-124.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 19:06:47.070086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.070039 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 19:06:47.070418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.070395 2564 server.go:317] "Adding debug handlers to kubelet server" Apr 24 19:06:47.076009 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.075608 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 19:06:47.076640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.076290 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:47.077363 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.076922 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.077363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077003 2564 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 19:06:47.077363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077028 2564 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 19:06:47.077363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077045 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 19:06:47.077363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077121 2564 reconstruct.go:97] "Volume reconstruction finished" Apr 24 19:06:47.077363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077129 2564 reconciler.go:26] "Reconciler: start to sync state" Apr 24 19:06:47.077363 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.076358 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-124.ec2.internal.18a9607a76aec8f5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-124.ec2.internal,UID:ip-10-0-129-124.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-124.ec2.internal,},FirstTimestamp:2026-04-24 19:06:47.067814133 +0000 UTC m=+0.392058292,LastTimestamp:2026-04-24 19:06:47.067814133 +0000 UTC m=+0.392058292,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-124.ec2.internal,}" Apr 24 19:06:47.077668 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077497 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot 
unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 19:06:47.077668 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077509 2564 factory.go:55] Registering systemd factory Apr 24 19:06:47.077668 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.077517 2564 factory.go:223] Registration of the systemd container factory successfully Apr 24 19:06:47.078703 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.078685 2564 factory.go:153] Registering CRI-O factory Apr 24 19:06:47.078703 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.078705 2564 factory.go:223] Registration of the crio container factory successfully Apr 24 19:06:47.078802 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.078730 2564 factory.go:103] Registering Raw factory Apr 24 19:06:47.078802 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.078745 2564 manager.go:1196] Started watching for new ooms in manager Apr 24 19:06:47.079198 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.079185 2564 manager.go:319] Starting recovery of all containers Apr 24 19:06:47.088971 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.088781 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-124.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 19:06:47.088971 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.088787 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 19:06:47.088971 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.088918 2564 manager.go:324] 
Recovery completed Apr 24 19:06:47.094083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.094068 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:47.096242 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.096227 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:47.096311 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.096255 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:47.096311 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.096266 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:47.096815 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.096800 2564 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 19:06:47.096815 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.096814 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 19:06:47.096919 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.096832 2564 state_mem.go:36] "Initialized new in-memory state store" Apr 24 19:06:47.098764 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.098752 2564 policy_none.go:49] "None policy: Start" Apr 24 19:06:47.098805 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.098768 2564 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 19:06:47.098805 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.098778 2564 state_mem.go:35] "Initializing new in-memory state store" Apr 24 19:06:47.101463 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.101438 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gn668" Apr 24 19:06:47.106757 ip-10-0-129-124 kubenswrapper[2564]: 
E0424 19:06:47.106690 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-124.ec2.internal.18a9607a7860948a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-124.ec2.internal,UID:ip-10-0-129-124.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-124.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-124.ec2.internal,},FirstTimestamp:2026-04-24 19:06:47.096243338 +0000 UTC m=+0.420487491,LastTimestamp:2026-04-24 19:06:47.096243338 +0000 UTC m=+0.420487491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-124.ec2.internal,}" Apr 24 19:06:47.107814 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.107798 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gn668" Apr 24 19:06:47.139196 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.139179 2564 manager.go:341] "Starting Device Plugin manager" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.139212 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.139223 2564 server.go:85] "Starting device plugin registration server" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.139491 2564 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.139506 2564 container_log_manager.go:189] "Initializing 
container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.139590 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.139670 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.139676 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.140441 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 24 19:06:47.150376 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.140487 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.178272 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.178237 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 19:06:47.179533 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.179514 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 19:06:47.179635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.179547 2564 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 19:06:47.179635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.179591 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 24 19:06:47.179635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.179601 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 19:06:47.179778 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.179643 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 19:06:47.183077 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.183060 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:47.241364 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.241279 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:47.242492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.242474 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:47.242628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.242510 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:47.242628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.242523 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:47.242628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.242573 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.251695 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.251675 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.251777 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.251699 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-124.ec2.internal\": node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 
19:06:47.269198 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.269171 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.280511 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.280477 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal"] Apr 24 19:06:47.280598 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.280568 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:47.281441 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.281425 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:47.281520 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.281452 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:47.281520 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.281463 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:47.282659 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.282647 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:47.282811 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.282798 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.282847 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.282825 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:47.283378 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.283354 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:47.283485 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.283387 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:47.283485 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.283411 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:47.283485 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.283358 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:47.283485 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.283455 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:47.283485 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.283466 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:47.284438 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.284422 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.284520 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.284451 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 19:06:47.285130 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.285116 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientMemory" Apr 24 19:06:47.285221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.285145 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 19:06:47.285221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.285160 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeHasSufficientPID" Apr 24 19:06:47.313201 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.313155 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-124.ec2.internal\" not found" node="ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.317589 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.317573 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-124.ec2.internal\" not found" node="ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.369999 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.369973 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.379357 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.379319 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e59d31f4a1c1d4c0169e102c535467de-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal\" (UID: \"e59d31f4a1c1d4c0169e102c535467de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.379357 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.379362 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3af0b9266f4203b6c4070d0308bc061d-config\") pod \"kube-apiserver-proxy-ip-10-0-129-124.ec2.internal\" (UID: \"3af0b9266f4203b6c4070d0308bc061d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.379357 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.379378 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e59d31f4a1c1d4c0169e102c535467de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal\" (UID: \"e59d31f4a1c1d4c0169e102c535467de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.470711 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.470668 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.480119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.480097 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e59d31f4a1c1d4c0169e102c535467de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal\" (UID: \"e59d31f4a1c1d4c0169e102c535467de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.480198 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.480125 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/e59d31f4a1c1d4c0169e102c535467de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal\" (UID: \"e59d31f4a1c1d4c0169e102c535467de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.480198 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.480144 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3af0b9266f4203b6c4070d0308bc061d-config\") pod \"kube-apiserver-proxy-ip-10-0-129-124.ec2.internal\" (UID: \"3af0b9266f4203b6c4070d0308bc061d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.480273 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.480200 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e59d31f4a1c1d4c0169e102c535467de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal\" (UID: \"e59d31f4a1c1d4c0169e102c535467de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.480273 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.480257 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e59d31f4a1c1d4c0169e102c535467de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal\" (UID: \"e59d31f4a1c1d4c0169e102c535467de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.480346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.480291 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/3af0b9266f4203b6c4070d0308bc061d-config\") pod \"kube-apiserver-proxy-ip-10-0-129-124.ec2.internal\" (UID: \"3af0b9266f4203b6c4070d0308bc061d\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.571602 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.571493 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.615000 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.614962 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.619446 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.619431 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" Apr 24 19:06:47.672021 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.671985 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.772530 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.772496 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.873039 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.872970 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.973494 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:47.973460 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:47.988886 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.988861 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 19:06:47.989029 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:47.988989 2564 reflector.go:556] "Warning: watch ended with error" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 19:06:48.074121 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:48.074090 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:48.076970 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.076956 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 19:06:48.092024 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.091999 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 19:06:48.101290 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:48.101255 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3af0b9266f4203b6c4070d0308bc061d.slice/crio-8a96e0c44cdc70ed37153fb646c73b77b63bc894bececa3d86eaca0aa258764e WatchSource:0}: Error finding container 8a96e0c44cdc70ed37153fb646c73b77b63bc894bececa3d86eaca0aa258764e: Status 404 returned error can't find the container with id 8a96e0c44cdc70ed37153fb646c73b77b63bc894bececa3d86eaca0aa258764e Apr 24 19:06:48.101832 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:48.101811 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode59d31f4a1c1d4c0169e102c535467de.slice/crio-3297404fa4bbc82b03ce6f4b7267995ddeffa1b9696e9e4feae0e1d9aeb05545 WatchSource:0}: Error finding container 3297404fa4bbc82b03ce6f4b7267995ddeffa1b9696e9e4feae0e1d9aeb05545: Status 404 returned error can't find the container with id 3297404fa4bbc82b03ce6f4b7267995ddeffa1b9696e9e4feae0e1d9aeb05545 Apr 24 
19:06:48.106303 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.106287 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:06:48.110120 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.110094 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 19:01:47 +0000 UTC" deadline="2027-12-13 18:36:01.739871865 +0000 UTC" Apr 24 19:06:48.110120 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.110118 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14351h29m13.629756826s" Apr 24 19:06:48.111217 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.111204 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:48.132762 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.132682 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bqssr" Apr 24 19:06:48.143521 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.143494 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bqssr" Apr 24 19:06:48.175026 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:48.174995 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:48.183178 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.183133 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" event={"ID":"e59d31f4a1c1d4c0169e102c535467de","Type":"ContainerStarted","Data":"3297404fa4bbc82b03ce6f4b7267995ddeffa1b9696e9e4feae0e1d9aeb05545"} Apr 24 19:06:48.184000 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:06:48.183977 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" event={"ID":"3af0b9266f4203b6c4070d0308bc061d","Type":"ContainerStarted","Data":"8a96e0c44cdc70ed37153fb646c73b77b63bc894bececa3d86eaca0aa258764e"} Apr 24 19:06:48.275154 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:48.275119 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:48.375621 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:48.375590 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:48.476129 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:48.476101 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:48.576237 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:48.576205 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-124.ec2.internal\" not found" Apr 24 19:06:48.583162 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.583132 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:48.668279 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.668250 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:48.677025 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.676822 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" Apr 24 19:06:48.691262 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.691231 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is 
recommended: [must not contain dots]" Apr 24 19:06:48.692204 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.692183 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" Apr 24 19:06:48.701491 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:48.701467 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 19:06:49.057642 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.057610 2564 apiserver.go:52] "Watching apiserver" Apr 24 19:06:49.063749 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.063722 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 19:06:49.065051 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.065025 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7","openshift-cluster-node-tuning-operator/tuned-ctptc","openshift-image-registry/node-ca-nbt8b","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal","openshift-multus/multus-8svws","openshift-multus/network-metrics-daemon-p2bz2","openshift-network-operator/iptables-alerter-hc4w2","kube-system/konnectivity-agent-6wkx2","openshift-multus/multus-additional-cni-plugins-rb2v6","openshift-network-diagnostics/network-check-target-mdmw5","openshift-ovn-kubernetes/ovnkube-node-d6t5k"] Apr 24 19:06:49.066789 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.066769 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.067892 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.067869 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.069408 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.069389 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 19:06:49.069640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.069599 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 19:06:49.069640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.069630 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 19:06:49.069764 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.069729 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-lgxvb\"" Apr 24 19:06:49.070423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.070176 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:49.070423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.070200 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nbt8b" Apr 24 19:06:49.070423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.070286 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-xkz9p\"" Apr 24 19:06:49.070423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.070304 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:49.071506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.071488 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8svws" Apr 24 19:06:49.072505 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.072361 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 19:06:49.072505 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.072460 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 19:06:49.072714 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.072509 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 19:06:49.072714 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.072628 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-52zwb\"" Apr 24 19:06:49.072830 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.072756 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:06:49.072886 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.072828 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f" Apr 24 19:06:49.073721 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.073692 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 19:06:49.073809 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.073766 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 19:06:49.073809 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.073771 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 24 19:06:49.074488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.073982 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 19:06:49.074488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.073985 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.074488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.074122 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xcndp\"" Apr 24 19:06:49.074488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.074132 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:06:49.075354 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.075334 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.076471 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.076640 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.076884 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mzd8w\"" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.076919 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.076937 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.077345 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.077736 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-pgvc8\"" Apr 24 19:06:49.077993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.077963 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 19:06:49.078376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.078144 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-cpffq\"" Apr 24 19:06:49.078376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.078229 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 19:06:49.079935 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.079825 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:06:49.079935 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.079896 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30" Apr 24 19:06:49.079935 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.079913 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.082819 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.082662 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 19:06:49.082819 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.082747 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 19:06:49.082930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.082871 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 19:06:49.083205 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.083187 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 19:06:49.083363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.083345 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 19:06:49.083441 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.083373 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sd4tx\"" Apr 24 19:06:49.083441 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.083430 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 19:06:49.088758 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088728 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-socket-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.088873 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088771 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-var-lib-kubelet\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.088873 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088795 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-host\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b" Apr 24 19:06:49.088873 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088818 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2017f643-59f9-484c-aabe-6af06168539a-etc-tuned\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.088873 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088867 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-cni-multus\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.089076 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088901 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-system-cni-dir\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.089076 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088964 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.089076 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.088998 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-kubernetes\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.089076 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089024 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-hostroot\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.089076 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089048 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-daemon-config\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.089076 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:06:49.089074 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtbvt\" (UniqueName: \"kubernetes.io/projected/9a8daf37-7548-4dd2-ba26-c79a7de10480-kube-api-access-gtbvt\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089097 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/07894671-8fdc-4378-8129-c0529b896ce6-agent-certs\") pod \"konnectivity-agent-6wkx2\" (UID: \"07894671-8fdc-4378-8129-c0529b896ce6\") " pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089136 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089169 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-sys-fs\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089191 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysctl-d\") pod \"tuned-ctptc\" (UID: 
\"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089217 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/07894671-8fdc-4378-8129-c0529b896ce6-konnectivity-ca\") pod \"konnectivity-agent-6wkx2\" (UID: \"07894671-8fdc-4378-8129-c0529b896ce6\") " pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089255 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089281 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-modprobe-d\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089304 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-etc-kubernetes\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.089347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089346 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb8c4\" 
(UniqueName: \"kubernetes.io/projected/519a2b19-a52e-492e-b937-624360a7c8ad-kube-api-access-xb8c4\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089387 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089415 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-registration-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089442 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72tv\" (UniqueName: \"kubernetes.io/projected/9104b10b-fe94-4977-b556-addf9a7f232f-kube-api-access-h72tv\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089464 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-conf-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 
19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089491 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cnibin\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089515 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-cni-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089538 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-socket-dir-parent\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089582 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysconfig\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089606 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-cnibin\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089628 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/519a2b19-a52e-492e-b937-624360a7c8ad-host-slash\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089654 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vz8f\" (UniqueName: \"kubernetes.io/projected/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-kube-api-access-2vz8f\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089677 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-systemd\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089710 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-host\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089745 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2017f643-59f9-484c-aabe-6af06168539a-tmp\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089762 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7b8kg\" (UniqueName: \"kubernetes.io/projected/2017f643-59f9-484c-aabe-6af06168539a-kube-api-access-7b8kg\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.089794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089788 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-cni-bin\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089814 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wpp\" (UniqueName: \"kubernetes.io/projected/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-kube-api-access-v7wpp\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089841 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-device-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089862 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-sys\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089891 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-os-release\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089913 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-kubelet\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089933 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.089958 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/519a2b19-a52e-492e-b937-624360a7c8ad-iptables-alerter-script\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090047 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-serviceca\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090071 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-netns\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090096 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090119 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090142 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-run\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090167 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7n96\" (UniqueName: \"kubernetes.io/projected/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-kube-api-access-t7n96\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090192 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-k8s-cni-cncf-io\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.090418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090217 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysctl-conf\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.091086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090240 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-lib-modules\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.091086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-system-cni-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.091086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090322 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a8daf37-7548-4dd2-ba26-c79a7de10480-cni-binary-copy\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.091086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090342 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-multus-certs\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.091086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.090367 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-os-release\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.144797 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.144757 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:48 +0000 UTC" deadline="2027-11-10 00:41:48.720532571 +0000 UTC"
Apr 24 19:06:49.144797 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.144790 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13541h34m59.575746537s"
Apr 24 19:06:49.177922 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.177887 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 19:06:49.190943 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.190905 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-k8s-cni-cncf-io\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.190952 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.191109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.190976 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccfa5733-74cd-4083-833e-376f6fc796e2-ovn-node-metrics-cert\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.191109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191002 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysctl-conf\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.191109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191032 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-k8s-cni-cncf-io\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-lib-modules\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.191109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191092 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-system-cni-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191113 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a8daf37-7548-4dd2-ba26-c79a7de10480-cni-binary-copy\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191129 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-multus-certs\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191149 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysctl-conf\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191179 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-lib-modules\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191183 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-multus-certs\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191154 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-os-release\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191220 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-os-release\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191221 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-system-cni-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191229 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-socket-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191260 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-var-lib-kubelet\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-host\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191312 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-cni-bin\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191343 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2017f643-59f9-484c-aabe-6af06168539a-etc-tuned\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191345 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-socket-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191379 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-cni-multus\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191387 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-host\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191407 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-cni-netd\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.191476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191434 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-var-lib-kubelet\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191434 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-system-cni-dir\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191468 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-cni-multus\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191470 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191497 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-kubernetes\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191505 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-system-cni-dir\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191521 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-hostroot\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191544 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-daemon-config\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191577 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-kubernetes\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191585 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtbvt\" (UniqueName: \"kubernetes.io/projected/9a8daf37-7548-4dd2-ba26-c79a7de10480-kube-api-access-gtbvt\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191615 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-hostroot\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191615 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-kubelet\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191650 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-var-lib-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191679 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/07894671-8fdc-4378-8129-c0529b896ce6-agent-certs\") pod \"konnectivity-agent-6wkx2\" (UID: \"07894671-8fdc-4378-8129-c0529b896ce6\") " pod="kube-system/konnectivity-agent-6wkx2"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191702 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191695 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191750 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-sys-fs\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.192267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191772 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysctl-d\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191798 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-run-netns\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191838 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191868 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/07894671-8fdc-4378-8129-c0529b896ce6-konnectivity-ca\") pod \"konnectivity-agent-6wkx2\" (UID: \"07894671-8fdc-4378-8129-c0529b896ce6\") " pod="kube-system/konnectivity-agent-6wkx2"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191888 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9a8daf37-7548-4dd2-ba26-c79a7de10480-cni-binary-copy\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191942 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191943 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysctl-d\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191891 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.191976 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-modprobe-d\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192093 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-daemon-config\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192111 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192135 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192155 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-sys-fs\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192175 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-etc-kubernetes\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192211 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xb8c4\" (UniqueName: \"kubernetes.io/projected/519a2b19-a52e-492e-b937-624360a7c8ad-kube-api-access-xb8c4\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192263 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-modprobe-d\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192265 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-etc-kubernetes\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.192966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192302 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-log-socket\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192396 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fnp5\" (UniqueName: \"kubernetes.io/projected/ccfa5733-74cd-4083-833e-376f6fc796e2-kube-api-access-4fnp5\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192416 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/07894671-8fdc-4378-8129-c0529b896ce6-konnectivity-ca\") pod \"konnectivity-agent-6wkx2\" (UID: \"07894671-8fdc-4378-8129-c0529b896ce6\") " pod="kube-system/konnectivity-agent-6wkx2"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192446 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192477 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-registration-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192531 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-registration-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192593 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h72tv\" (UniqueName: \"kubernetes.io/projected/9104b10b-fe94-4977-b556-addf9a7f232f-kube-api-access-h72tv\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192633 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-conf-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws"
Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192664 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-etc-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID:
\"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192693 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-env-overrides\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192720 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cnibin\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192747 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-cni-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192772 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-socket-dir-parent\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192799 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-node-log\") pod 
\"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192821 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysconfig\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192844 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-cnibin\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192859 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/519a2b19-a52e-492e-b937-624360a7c8ad-host-slash\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.193765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192876 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vz8f\" (UniqueName: \"kubernetes.io/projected/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-kube-api-access-2vz8f\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192905 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-systemd\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192923 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-cni-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192929 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cni-binary-copy\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192928 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-host\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192999 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-host\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.192999 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-socket-dir-parent\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193006 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-cnibin\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193013 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-cnibin\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193023 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/519a2b19-a52e-492e-b937-624360a7c8ad-host-slash\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193032 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2017f643-59f9-484c-aabe-6af06168539a-tmp\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193065 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-multus-conf-dir\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193075 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-sysconfig\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193083 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7b8kg\" (UniqueName: \"kubernetes.io/projected/2017f643-59f9-484c-aabe-6af06168539a-kube-api-access-7b8kg\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193101 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-cni-bin\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193138 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-cni-bin\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193179 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-etc-systemd\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193270 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-systemd-units\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.194546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193304 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wpp\" (UniqueName: \"kubernetes.io/projected/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-kube-api-access-v7wpp\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193332 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-systemd\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193362 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-device-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193385 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-sys\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193410 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-os-release\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193435 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-kubelet\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193456 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-device-dir\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193459 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-slash\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193501 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-var-lib-kubelet\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193504 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-os-release\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193538 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193572 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-sys\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193605 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:06:49.195325 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:06:49.193638 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/519a2b19-a52e-492e-b937-624360a7c8ad-iptables-alerter-script\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193770 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193800 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-serviceca\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193835 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-netns\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.195325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193863 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-ovnkube-script-lib\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193894 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193921 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-ovn\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193944 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-ovnkube-config\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193970 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.193993 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-run\") pod \"tuned-ctptc\" (UID: 
\"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.194019 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7n96\" (UniqueName: \"kubernetes.io/projected/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-kube-api-access-t7n96\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.194229 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-serviceca\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.194295 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/9a8daf37-7548-4dd2-ba26-c79a7de10480-host-run-netns\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.194409 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.194485 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:49.694458057 +0000 UTC m=+3.018702212 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.194730 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: \"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.194777 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-etc-selinux\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.194797 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2017f643-59f9-484c-aabe-6af06168539a-run\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.194846 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/519a2b19-a52e-492e-b937-624360a7c8ad-iptables-alerter-script\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.196058 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:06:49.195603 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/2017f643-59f9-484c-aabe-6af06168539a-tmp\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.195793 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/2017f643-59f9-484c-aabe-6af06168539a-etc-tuned\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.196058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.195885 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/07894671-8fdc-4378-8129-c0529b896ce6-agent-certs\") pod \"konnectivity-agent-6wkx2\" (UID: \"07894671-8fdc-4378-8129-c0529b896ce6\") " pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:06:49.200670 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.200644 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtbvt\" (UniqueName: \"kubernetes.io/projected/9a8daf37-7548-4dd2-ba26-c79a7de10480-kube-api-access-gtbvt\") pod \"multus-8svws\" (UID: \"9a8daf37-7548-4dd2-ba26-c79a7de10480\") " pod="openshift-multus/multus-8svws" Apr 24 19:06:49.200900 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.200837 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:49.200900 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.200860 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 24 19:06:49.200900 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.200875 2564 projected.go:194] Error preparing data for projected volume kube-api-access-rtwff for pod openshift-network-diagnostics/network-check-target-mdmw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:49.200900 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.200892 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb8c4\" (UniqueName: \"kubernetes.io/projected/519a2b19-a52e-492e-b937-624360a7c8ad-kube-api-access-xb8c4\") pod \"iptables-alerter-hc4w2\" (UID: \"519a2b19-a52e-492e-b937-624360a7c8ad\") " pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.201167 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.200935 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff podName:f7682155-10ef-40a7-9a0c-cef3315bdd30 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:49.700917762 +0000 UTC m=+3.025161906 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-rtwff" (UniqueName: "kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff") pod "network-check-target-mdmw5" (UID: "f7682155-10ef-40a7-9a0c-cef3315bdd30") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:49.201167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.201006 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72tv\" (UniqueName: \"kubernetes.io/projected/9104b10b-fe94-4977-b556-addf9a7f232f-kube-api-access-h72tv\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:06:49.203211 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.203189 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7n96\" (UniqueName: \"kubernetes.io/projected/e51af979-d2ff-49ba-8e4e-7620a2a4cd7e-kube-api-access-t7n96\") pod \"node-ca-nbt8b\" (UID: \"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e\") " pod="openshift-image-registry/node-ca-nbt8b" Apr 24 19:06:49.203518 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.203470 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vz8f\" (UniqueName: \"kubernetes.io/projected/6d5683da-deea-4d8c-aab6-4b62f83fa8d2-kube-api-access-2vz8f\") pod \"aws-ebs-csi-driver-node-65xn7\" (UID: \"6d5683da-deea-4d8c-aab6-4b62f83fa8d2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.204114 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.204066 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wpp\" (UniqueName: \"kubernetes.io/projected/e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d-kube-api-access-v7wpp\") pod \"multus-additional-cni-plugins-rb2v6\" (UID: 
\"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d\") " pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.204369 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.204350 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7b8kg\" (UniqueName: \"kubernetes.io/projected/2017f643-59f9-484c-aabe-6af06168539a-kube-api-access-7b8kg\") pod \"tuned-ctptc\" (UID: \"2017f643-59f9-484c-aabe-6af06168539a\") " pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.295244 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295210 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-kubelet\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295249 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-var-lib-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295277 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-run-netns\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295300 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295320 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-kubelet\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295325 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-log-socket\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295366 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-log-socket\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295387 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4fnp5\" (UniqueName: \"kubernetes.io/projected/ccfa5733-74cd-4083-833e-376f6fc796e2-kube-api-access-4fnp5\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295388 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-var-lib-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295391 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295416 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-etc-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295419 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-run-netns\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295439 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-env-overrides\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295463 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-etc-openvswitch\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295506 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-node-log\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295465 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-node-log\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295571 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-systemd-units\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295590 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-systemd\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295609 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-slash\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295627 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295644 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-systemd-units\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295669 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-systemd\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295692 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295695 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-slash\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295656 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-ovnkube-script-lib\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295746 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-ovn\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295763 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-ovnkube-config\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295791 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.295940 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:06:49.295811 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-run-ovn\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295825 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccfa5733-74cd-4083-833e-376f6fc796e2-ovn-node-metrics-cert\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295852 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-run-ovn-kubernetes\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295861 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-cni-bin\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295918 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-cni-netd\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:06:49.295962 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-cni-bin\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295980 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/ccfa5733-74cd-4083-833e-376f6fc796e2-host-cni-netd\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.295990 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-env-overrides\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.296292 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-ovnkube-script-lib\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.296479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.296318 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/ccfa5733-74cd-4083-833e-376f6fc796e2-ovnkube-config\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.298415 ip-10-0-129-124 kubenswrapper[2564]: 
I0424 19:06:49.298394 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/ccfa5733-74cd-4083-833e-376f6fc796e2-ovn-node-metrics-cert\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.305766 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.305746 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fnp5\" (UniqueName: \"kubernetes.io/projected/ccfa5733-74cd-4083-833e-376f6fc796e2-kube-api-access-4fnp5\") pod \"ovnkube-node-d6t5k\" (UID: \"ccfa5733-74cd-4083-833e-376f6fc796e2\") " pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.348941 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.348872 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 19:06:49.379969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.379932 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" Apr 24 19:06:49.386854 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.386827 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ctptc" Apr 24 19:06:49.395141 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.395121 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nbt8b" Apr 24 19:06:49.400832 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.400815 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-8svws" Apr 24 19:06:49.407388 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.407369 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-hc4w2" Apr 24 19:06:49.412904 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.412884 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:06:49.419451 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.419427 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" Apr 24 19:06:49.425119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.425100 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:06:49.698145 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.698116 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:06:49.698325 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.698291 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:49.698410 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.698381 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:50.698360332 +0000 UTC m=+4.022604491 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:06:49.740596 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:49.740506 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod519a2b19_a52e_492e_b937_624360a7c8ad.slice/crio-593d26a9d252705c17b0b05228a18385ed04baae991dea39abdd59a2c8729991 WatchSource:0}: Error finding container 593d26a9d252705c17b0b05228a18385ed04baae991dea39abdd59a2c8729991: Status 404 returned error can't find the container with id 593d26a9d252705c17b0b05228a18385ed04baae991dea39abdd59a2c8729991 Apr 24 19:06:49.741786 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:49.741708 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2017f643_59f9_484c_aabe_6af06168539a.slice/crio-724c8b2a18eb4387bdb79d2db6374928475546b5f3208977d01eec9cd5e15a9b WatchSource:0}: Error finding container 724c8b2a18eb4387bdb79d2db6374928475546b5f3208977d01eec9cd5e15a9b: Status 404 returned error can't find the container with id 724c8b2a18eb4387bdb79d2db6374928475546b5f3208977d01eec9cd5e15a9b Apr 24 19:06:49.742478 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:49.742443 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode51af979_d2ff_49ba_8e4e_7620a2a4cd7e.slice/crio-47dcad70998f10c049ce4c813b259d47b72c6898ed88a8be632ee7e9ef6cdfcb WatchSource:0}: Error finding container 47dcad70998f10c049ce4c813b259d47b72c6898ed88a8be632ee7e9ef6cdfcb: Status 404 returned error can't find the container with id 47dcad70998f10c049ce4c813b259d47b72c6898ed88a8be632ee7e9ef6cdfcb Apr 24 19:06:49.751405 
ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:49.750779 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d5683da_deea_4d8c_aab6_4b62f83fa8d2.slice/crio-a04bfdd83219de6b9209c6b6afd8624d069bbb6f8dfe988e9dd17bb5279754db WatchSource:0}: Error finding container a04bfdd83219de6b9209c6b6afd8624d069bbb6f8dfe988e9dd17bb5279754db: Status 404 returned error can't find the container with id a04bfdd83219de6b9209c6b6afd8624d069bbb6f8dfe988e9dd17bb5279754db Apr 24 19:06:49.751405 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:06:49.751293 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8daf37_7548_4dd2_ba26_c79a7de10480.slice/crio-ea07542c24558afdc6017ce552bcd394450a0d6d37af882aba2df5a63c8492d7 WatchSource:0}: Error finding container ea07542c24558afdc6017ce552bcd394450a0d6d37af882aba2df5a63c8492d7: Status 404 returned error can't find the container with id ea07542c24558afdc6017ce552bcd394450a0d6d37af882aba2df5a63c8492d7 Apr 24 19:06:49.798539 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:49.798396 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:06:49.798663 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.798539 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:06:49.798663 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.798576 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:06:49.798663 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.798586 2564 projected.go:194] Error preparing data for projected volume kube-api-access-rtwff for pod openshift-network-diagnostics/network-check-target-mdmw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:49.798663 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:49.798635 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff podName:f7682155-10ef-40a7-9a0c-cef3315bdd30 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:50.798618446 +0000 UTC m=+4.122862602 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rtwff" (UniqueName: "kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff") pod "network-check-target-mdmw5" (UID: "f7682155-10ef-40a7-9a0c-cef3315bdd30") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:06:50.145531 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.145388 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 19:01:48 +0000 UTC" deadline="2027-12-21 05:42:07.57045293 +0000 UTC" Apr 24 19:06:50.145531 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.145428 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14530h35m17.425028971s" Apr 24 19:06:50.202407 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.202367 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" 
event={"ID":"3af0b9266f4203b6c4070d0308bc061d","Type":"ContainerStarted","Data":"fba29044f321278ec18ad4e94c25a21bde9b7dcf335cb95f3146cb245ffad388"} Apr 24 19:06:50.211067 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.210986 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6wkx2" event={"ID":"07894671-8fdc-4378-8129-c0529b896ce6","Type":"ContainerStarted","Data":"25d904e08122a3da4160808d2d749b01cf2d19558dc11893fda38758e7c879f1"} Apr 24 19:06:50.217438 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.216993 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-124.ec2.internal" podStartSLOduration=2.21697537 podStartE2EDuration="2.21697537s" podCreationTimestamp="2026-04-24 19:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:50.216518332 +0000 UTC m=+3.540762496" watchObservedRunningTime="2026-04-24 19:06:50.21697537 +0000 UTC m=+3.541219534" Apr 24 19:06:50.217643 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.217594 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"7bf53324c87bec5336e6dfcd056e276d7c559359f1bc486facb21a2b6e78a760"} Apr 24 19:06:50.224569 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.224483 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ctptc" event={"ID":"2017f643-59f9-484c-aabe-6af06168539a","Type":"ContainerStarted","Data":"724c8b2a18eb4387bdb79d2db6374928475546b5f3208977d01eec9cd5e15a9b"} Apr 24 19:06:50.227188 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.227101 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svws" 
event={"ID":"9a8daf37-7548-4dd2-ba26-c79a7de10480","Type":"ContainerStarted","Data":"ea07542c24558afdc6017ce552bcd394450a0d6d37af882aba2df5a63c8492d7"} Apr 24 19:06:50.229303 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.229243 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" event={"ID":"6d5683da-deea-4d8c-aab6-4b62f83fa8d2","Type":"ContainerStarted","Data":"a04bfdd83219de6b9209c6b6afd8624d069bbb6f8dfe988e9dd17bb5279754db"} Apr 24 19:06:50.234978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.234913 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerStarted","Data":"e0072f03d8fda583520eb68edab88c23de10a0df2eebd82e6d759786e64e2d26"} Apr 24 19:06:50.238658 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.238593 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nbt8b" event={"ID":"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e","Type":"ContainerStarted","Data":"47dcad70998f10c049ce4c813b259d47b72c6898ed88a8be632ee7e9ef6cdfcb"} Apr 24 19:06:50.240546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.240525 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hc4w2" event={"ID":"519a2b19-a52e-492e-b937-624360a7c8ad","Type":"ContainerStarted","Data":"593d26a9d252705c17b0b05228a18385ed04baae991dea39abdd59a2c8729991"} Apr 24 19:06:50.704928 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.704898 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:06:50.705060 ip-10-0-129-124 
kubenswrapper[2564]: E0424 19:06:50.705043 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:50.705131 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:50.705112 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:52.705093031 +0000 UTC m=+6.029337178 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:50.806121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:50.805265 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:50.806121 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:50.805456 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:50.806121 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:50.805477 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:50.806121 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:50.805490 2564 projected.go:194] Error preparing data for projected volume kube-api-access-rtwff for pod openshift-network-diagnostics/network-check-target-mdmw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:50.806121 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:50.805608 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff podName:f7682155-10ef-40a7-9a0c-cef3315bdd30 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:52.805588234 +0000 UTC m=+6.129832391 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rtwff" (UniqueName: "kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff") pod "network-check-target-mdmw5" (UID: "f7682155-10ef-40a7-9a0c-cef3315bdd30") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:51.180846 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:51.180225 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:51.180846 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:51.180266 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:51.180846 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:51.180375 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:06:51.180846 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:51.180492 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:06:51.258165 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:51.258128 2564 generic.go:358] "Generic (PLEG): container finished" podID="e59d31f4a1c1d4c0169e102c535467de" containerID="5b8c889a1a0e9c84f34c146fbd0667048f6be241b6d5fa4a8ff56597cd049e82" exitCode=0
Apr 24 19:06:51.258672 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:51.258612 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" event={"ID":"e59d31f4a1c1d4c0169e102c535467de","Type":"ContainerDied","Data":"5b8c889a1a0e9c84f34c146fbd0667048f6be241b6d5fa4a8ff56597cd049e82"}
Apr 24 19:06:52.264733 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:52.264633 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" event={"ID":"e59d31f4a1c1d4c0169e102c535467de","Type":"ContainerStarted","Data":"4d12c45be1f6f4ad00a372057060f5b9d554dd02465a1036be93b1b134102b93"}
Apr 24 19:06:52.724511 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:52.723944 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:52.724511 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:52.724094 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:52.724511 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:52.724159 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:06:56.724137683 +0000 UTC m=+10.048381829 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:52.825483 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:52.824863 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:52.825483 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:52.825063 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:52.825483 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:52.825080 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:52.825483 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:52.825092 2564 projected.go:194] Error preparing data for projected volume kube-api-access-rtwff for pod openshift-network-diagnostics/network-check-target-mdmw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:52.825483 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:52.825148 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff podName:f7682155-10ef-40a7-9a0c-cef3315bdd30 nodeName:}" failed. No retries permitted until 2026-04-24 19:06:56.825130166 +0000 UTC m=+10.149374309 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rtwff" (UniqueName: "kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff") pod "network-check-target-mdmw5" (UID: "f7682155-10ef-40a7-9a0c-cef3315bdd30") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:53.180977 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:53.180889 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:53.181126 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:53.181014 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:06:53.181410 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:53.181390 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:53.181536 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:53.181496 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:06:55.180859 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:55.180817 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:55.180859 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:55.180824 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:55.181382 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:55.180967 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:06:55.181382 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:55.181172 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:06:56.754674 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:56.754637 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:56.755148 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:56.754806 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:56.755148 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:56.754887 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:04.754864596 +0000 UTC m=+18.079108754 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:06:56.855680 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:56.855583 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:56.855853 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:56.855771 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:06:56.855853 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:56.855794 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:06:56.855853 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:56.855808 2564 projected.go:194] Error preparing data for projected volume kube-api-access-rtwff for pod openshift-network-diagnostics/network-check-target-mdmw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:56.856028 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:56.855869 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff podName:f7682155-10ef-40a7-9a0c-cef3315bdd30 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:04.855849615 +0000 UTC m=+18.180093762 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rtwff" (UniqueName: "kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff") pod "network-check-target-mdmw5" (UID: "f7682155-10ef-40a7-9a0c-cef3315bdd30") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:06:57.181783 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.181335 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:57.181783 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:57.181473 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:06:57.181783 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.181545 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:57.181783 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:57.181690 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:06:57.259844 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.258692 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-124.ec2.internal" podStartSLOduration=9.258671047 podStartE2EDuration="9.258671047s" podCreationTimestamp="2026-04-24 19:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:06:52.283677012 +0000 UTC m=+5.607921176" watchObservedRunningTime="2026-04-24 19:06:57.258671047 +0000 UTC m=+10.582915213"
Apr 24 19:06:57.259844 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.259607 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-j4w5m"]
Apr 24 19:06:57.265776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.265748 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.270188 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.270159 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-jf8c5\""
Apr 24 19:06:57.270460 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.270438 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 24 19:06:57.271412 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.270656 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 24 19:06:57.360117 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.360068 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cccxc\" (UniqueName: \"kubernetes.io/projected/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-kube-api-access-cccxc\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.360281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.360221 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-tmp-dir\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.360281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.360275 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-hosts-file\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.461729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.461213 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-hosts-file\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.461729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.461274 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cccxc\" (UniqueName: \"kubernetes.io/projected/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-kube-api-access-cccxc\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.461729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.461373 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-hosts-file\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.461729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.461390 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-tmp-dir\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.462064 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.461857 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-tmp-dir\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.474376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.474312 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cccxc\" (UniqueName: \"kubernetes.io/projected/6d09a5ce-72ac-4dea-826a-c4a0beb36ce3-kube-api-access-cccxc\") pod \"node-resolver-j4w5m\" (UID: \"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3\") " pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:57.577776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:57.577739 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j4w5m"
Apr 24 19:06:59.180303 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:59.180263 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:06:59.180303 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:06:59.180303 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:06:59.180871 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:59.180387 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:06:59.180871 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:06:59.180509 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:01.180317 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:01.180277 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:01.180317 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:01.180308 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:01.180882 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:01.180404 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:07:01.180882 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:01.180525 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:03.180842 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:03.180802 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:03.180842 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:03.180827 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:03.181394 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:03.180941 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:03.181394 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:03.181038 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:07:04.813834 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:04.813795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:04.814277 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:04.813964 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:04.814277 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:04.814040 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.814021719 +0000 UTC m=+34.138265862 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 19:07:04.914153 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:04.914113 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:04.914324 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:04.914294 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 19:07:04.914324 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:04.914315 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 19:07:04.914324 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:04.914325 2564 projected.go:194] Error preparing data for projected volume kube-api-access-rtwff for pod openshift-network-diagnostics/network-check-target-mdmw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:04.914496 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:04.914391 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff podName:f7682155-10ef-40a7-9a0c-cef3315bdd30 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.914374696 +0000 UTC m=+34.238618848 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rtwff" (UniqueName: "kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff") pod "network-check-target-mdmw5" (UID: "f7682155-10ef-40a7-9a0c-cef3315bdd30") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 19:07:05.180202 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:05.180125 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:05.180202 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:05.180178 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:05.180400 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:05.180243 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:05.180400 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:05.180386 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:07:07.181230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.180858 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:07.182088 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.180916 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:07.182088 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:07.181278 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:07.182088 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:07.181340 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:07:07.292498 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.292459 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"853896bef67b2879140db085bf02c4904146102bb9591c15f446ecdc17b1b8b9"}
Apr 24 19:07:07.292498 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.292501 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"794b27df64e53ee15fdcb42989afc34a63c9a8cd757218a7ce929aa3ae7727ca"}
Apr 24 19:07:07.292732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.292512 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"a404a693fd4f9c930488d1eba90a6788a2a3b6d569585bf2ce64662d468e98bc"}
Apr 24 19:07:07.292732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.292520 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"6a7053433c373c2f42411c1c947d7c4aea348ee168fc73dc40a108b64e59963f"}
Apr 24 19:07:07.292732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.292534 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"b755ed46fbae0746ca5e5f7cf97dc729c74c4fbda0c06943614f09fff64ae5d0"}
Apr 24 19:07:07.292732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.292583 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"6b97a1eb190ec46fefdcd189d086aa888db48382be229ad4c512da20c75d1935"}
Apr 24 19:07:07.293751 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.293722 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ctptc" event={"ID":"2017f643-59f9-484c-aabe-6af06168539a","Type":"ContainerStarted","Data":"c53c305313b6792a797aa3fc6ee21b5657537b16a3f37f8b835868f542f88022"}
Apr 24 19:07:07.294956 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.294932 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j4w5m" event={"ID":"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3","Type":"ContainerStarted","Data":"09d3705de4d744e7819d64c8bdba791320a8e5e990ab743198718b1705044410"}
Apr 24 19:07:07.294956 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.294955 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j4w5m" event={"ID":"6d09a5ce-72ac-4dea-826a-c4a0beb36ce3","Type":"ContainerStarted","Data":"b3b30a37d9de924435f9700c69f2734bd72d502e9204ad98f3732f6bc0f14e89"}
Apr 24 19:07:07.296196 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.296174 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-8svws" event={"ID":"9a8daf37-7548-4dd2-ba26-c79a7de10480","Type":"ContainerStarted","Data":"a9ce9e39755094cf03081686f17b7f4ada77d358b61c1564f2da4855f7d5cce7"}
Apr 24 19:07:07.297368 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.297348 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" event={"ID":"6d5683da-deea-4d8c-aab6-4b62f83fa8d2","Type":"ContainerStarted","Data":"190b29b969102c929e1845a2b35553fd6ffcdceb98ffbc0552f52cce541daeb6"}
Apr 24 19:07:07.298545 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.298525 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d" containerID="c09f2bc4822f47eb1c04cbb22e624256f7442844b000b305b2538c03713a3b6a" exitCode=0
Apr 24 19:07:07.298638 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.298583 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerDied","Data":"c09f2bc4822f47eb1c04cbb22e624256f7442844b000b305b2538c03713a3b6a"}
Apr 24 19:07:07.299957 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.299800 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nbt8b" event={"ID":"e51af979-d2ff-49ba-8e4e-7620a2a4cd7e","Type":"ContainerStarted","Data":"7be13e77af853340522a72eaf88b98b5d539770a92746b8abd198fdaaa34fcbc"}
Apr 24 19:07:07.300923 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.300906 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6wkx2" event={"ID":"07894671-8fdc-4378-8129-c0529b896ce6","Type":"ContainerStarted","Data":"05a513b43080b862b02d56819fc29efae4c6ad12c6e7746632d2fbe8bd1e41ec"}
Apr 24 19:07:07.316887 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.316846 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ctptc" podStartSLOduration=3.754058829 podStartE2EDuration="20.316836145s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.743478817 +0000 UTC m=+3.067722958" lastFinishedPulling="2026-04-24 19:07:06.306256127 +0000 UTC m=+19.630500274" observedRunningTime="2026-04-24 19:07:07.316605304 +0000 UTC m=+20.640849469" watchObservedRunningTime="2026-04-24 19:07:07.316836145 +0000 UTC m=+20.641080308"
Apr 24 19:07:07.319863 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.319843 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-v6hxv"]
Apr 24 19:07:07.322560 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.322534 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv"
Apr 24 19:07:07.322624 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:07.322609 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa"
Apr 24 19:07:07.339685 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.339647 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-8svws" podStartSLOduration=3.7489547610000002 podStartE2EDuration="20.339634817s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.753897125 +0000 UTC m=+3.078141267" lastFinishedPulling="2026-04-24 19:07:06.344577176 +0000 UTC m=+19.668821323" observedRunningTime="2026-04-24 19:07:07.339593789 +0000 UTC m=+20.663837952" watchObservedRunningTime="2026-04-24 19:07:07.339634817 +0000 UTC m=+20.663878979"
Apr 24 19:07:07.378886 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.378842 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6wkx2" podStartSLOduration=3.90933106 podStartE2EDuration="20.378827471s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.746596972 +0000 UTC m=+3.070841113" lastFinishedPulling="2026-04-24 19:07:06.216093378 +0000 UTC m=+19.540337524" observedRunningTime="2026-04-24 19:07:07.360166655 +0000 UTC m=+20.684410820" watchObservedRunningTime="2026-04-24 19:07:07.378827471 +0000 UTC m=+20.703071646"
Apr 24 19:07:07.396698 ip-10-0-129-124
kubenswrapper[2564]: I0424 19:07:07.396656 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j4w5m" podStartSLOduration=10.396641253 podStartE2EDuration="10.396641253s" podCreationTimestamp="2026-04-24 19:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:07:07.396331786 +0000 UTC m=+20.720575953" watchObservedRunningTime="2026-04-24 19:07:07.396641253 +0000 UTC m=+20.720885426" Apr 24 19:07:07.412951 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.412909 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nbt8b" podStartSLOduration=3.941745138 podStartE2EDuration="20.412895675s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.744874157 +0000 UTC m=+3.069118301" lastFinishedPulling="2026-04-24 19:07:06.216024693 +0000 UTC m=+19.540268838" observedRunningTime="2026-04-24 19:07:07.412403352 +0000 UTC m=+20.736647519" watchObservedRunningTime="2026-04-24 19:07:07.412895675 +0000 UTC m=+20.737139837" Apr 24 19:07:07.431167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.431140 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:07.431296 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.431250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-dbus\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " 
pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:07.431693 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.431668 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-kubelet-config\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:07.456220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.456194 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 19:07:07.532285 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.532255 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:07.532410 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.532307 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-dbus\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:07.532410 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.532331 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-kubelet-config\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:07.532410 
ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:07.532384 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:07.532547 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.532411 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-kubelet-config\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:07.532547 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:07.532443 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret podName:62836674-92b3-4b2c-a4c9-e6896f0ff8fa nodeName:}" failed. No retries permitted until 2026-04-24 19:07:08.032424076 +0000 UTC m=+21.356668221 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret") pod "global-pull-secret-syncer-v6hxv" (UID: "62836674-92b3-4b2c-a4c9-e6896f0ff8fa") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:07.532547 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:07.532521 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-dbus\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:08.036374 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.036340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:08.036602 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:08.036496 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:08.036602 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:08.036581 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret podName:62836674-92b3-4b2c-a4c9-e6896f0ff8fa nodeName:}" failed. No retries permitted until 2026-04-24 19:07:09.036546494 +0000 UTC m=+22.360790652 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret") pod "global-pull-secret-syncer-v6hxv" (UID: "62836674-92b3-4b2c-a4c9-e6896f0ff8fa") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:08.152229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.152081 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T19:07:07.456215563Z","UUID":"e9b5ad5a-5ad7-46e4-b2cc-da1881b096ca","Handler":null,"Name":"","Endpoint":""} Apr 24 19:07:08.155667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.155639 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 19:07:08.155667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.155672 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 19:07:08.305682 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.305599 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" event={"ID":"6d5683da-deea-4d8c-aab6-4b62f83fa8d2","Type":"ContainerStarted","Data":"36f82e81bca04aa887f1659fdde0f5cfa0a3759f0837e577bc93ee41d7d7df62"} Apr 24 19:07:08.307540 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.307508 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hc4w2" event={"ID":"519a2b19-a52e-492e-b937-624360a7c8ad","Type":"ContainerStarted","Data":"9ad2ee1345f8774553772541f0b537208a66f3da15bc6eba5bdbef2723821ccb"} Apr 24 19:07:08.538974 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.538937 2564 kubelet.go:2658] "SyncLoop (probe)" 
probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:07:08.539689 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.539667 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:07:08.559596 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:08.559491 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-hc4w2" podStartSLOduration=4.995494465 podStartE2EDuration="21.559474634s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.742175227 +0000 UTC m=+3.066419368" lastFinishedPulling="2026-04-24 19:07:06.306155385 +0000 UTC m=+19.630399537" observedRunningTime="2026-04-24 19:07:08.334143926 +0000 UTC m=+21.658388089" watchObservedRunningTime="2026-04-24 19:07:08.559474634 +0000 UTC m=+21.883718797" Apr 24 19:07:09.044621 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:09.044430 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:09.044782 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:09.044606 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:09.044782 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:09.044735 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret podName:62836674-92b3-4b2c-a4c9-e6896f0ff8fa nodeName:}" failed. No retries permitted until 2026-04-24 19:07:11.044720233 +0000 UTC m=+24.368964374 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret") pod "global-pull-secret-syncer-v6hxv" (UID: "62836674-92b3-4b2c-a4c9-e6896f0ff8fa") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:09.179945 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:09.179832 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:07:09.180121 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:09.179953 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30" Apr 24 19:07:09.180121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:09.179841 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:09.180121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:09.179839 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:07:09.180121 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:09.180048 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa" Apr 24 19:07:09.180339 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:09.180132 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f" Apr 24 19:07:09.311120 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:09.311022 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" event={"ID":"6d5683da-deea-4d8c-aab6-4b62f83fa8d2","Type":"ContainerStarted","Data":"6d88b65cfe2beaca94b718e2cc6478883bb276129c51df0c6f8bb6a18c5766da"} Apr 24 19:07:09.314220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:09.314189 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"178eb34992aa36c86397b44c6d27fc08ef0a476a37dfd17a25d59f1cec679325"} Apr 24 19:07:10.316318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:10.316285 2564 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 19:07:10.795619 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:10.795589 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:07:10.796270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:10.796243 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6wkx2" Apr 24 19:07:10.810996 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:10.810942 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-65xn7" podStartSLOduration=5.338591779 podStartE2EDuration="23.810924664s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.752785322 +0000 UTC m=+3.077029463" lastFinishedPulling="2026-04-24 19:07:08.225118194 +0000 UTC m=+21.549362348" observedRunningTime="2026-04-24 19:07:09.335583295 +0000 UTC m=+22.659827458" watchObservedRunningTime="2026-04-24 19:07:10.810924664 +0000 UTC m=+24.135168830" Apr 24 19:07:11.059001 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:11.058915 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:11.059138 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:11.059079 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:11.059220 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:11.059151 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret podName:62836674-92b3-4b2c-a4c9-e6896f0ff8fa nodeName:}" failed. No retries permitted until 2026-04-24 19:07:15.059134517 +0000 UTC m=+28.383378658 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret") pod "global-pull-secret-syncer-v6hxv" (UID: "62836674-92b3-4b2c-a4c9-e6896f0ff8fa") : object "kube-system"/"original-pull-secret" not registered Apr 24 19:07:11.180566 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:11.180518 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:11.180566 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:11.180539 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:07:11.180733 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:11.180575 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:07:11.180733 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:11.180675 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa" Apr 24 19:07:11.180818 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:11.180783 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f" Apr 24 19:07:11.180881 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:11.180862 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30" Apr 24 19:07:11.320801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:11.320723 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" event={"ID":"ccfa5733-74cd-4083-833e-376f6fc796e2","Type":"ContainerStarted","Data":"cd43356f583c0663d7fe9d88e8a0b0057f320fca8d1404afb4cc8fc0e8f77583"} Apr 24 19:07:11.369393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:11.369196 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" podStartSLOduration=7.519676676 podStartE2EDuration="24.369179595s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.74525056 +0000 UTC m=+3.069494702" lastFinishedPulling="2026-04-24 19:07:06.594753478 +0000 UTC m=+19.918997621" observedRunningTime="2026-04-24 19:07:11.365824255 +0000 UTC m=+24.690068418" watchObservedRunningTime="2026-04-24 19:07:11.369179595 +0000 UTC m=+24.693423757" Apr 24 19:07:12.323559 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:12.323524 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d" containerID="0fbd2e14df51438bb5f38186e1de5360e29c84f882c359277449c829af5d8b6a" exitCode=0 Apr 24 19:07:12.324060 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:12.323614 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerDied","Data":"0fbd2e14df51438bb5f38186e1de5360e29c84f882c359277449c829af5d8b6a"} Apr 24 19:07:12.324782 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:12.324175 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:07:12.324782 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:07:12.324227 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:07:12.324782 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:12.324238 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:07:12.339466 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:12.339442 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:07:12.339602 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:12.339508 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k" Apr 24 19:07:13.180834 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.180702 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:07:13.180834 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.180744 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:07:13.180834 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.180744 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:13.181093 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:13.180852 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f" Apr 24 19:07:13.181093 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:13.181051 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30" Apr 24 19:07:13.181304 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:13.181135 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa" Apr 24 19:07:13.193050 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.193018 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mdmw5"] Apr 24 19:07:13.195846 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.195819 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-v6hxv"] Apr 24 19:07:13.199754 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.199731 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p2bz2"] Apr 24 19:07:13.327502 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.327470 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d" containerID="9fd68681f0324e33e1f87e50a125c6ad20cd5b194a0f06d5d5716c7947772c92" exitCode=0 Apr 24 19:07:13.328028 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:07:13.327583 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerDied","Data":"9fd68681f0324e33e1f87e50a125c6ad20cd5b194a0f06d5d5716c7947772c92"} Apr 24 19:07:13.328028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.327609 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:07:13.328028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.327633 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:13.328028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:13.327692 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:07:13.328028 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:13.327713 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30" Apr 24 19:07:13.328028 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:13.327821 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:13.328028 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:13.327845 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa"
Apr 24 19:07:14.333514 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:14.333482 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d" containerID="960e6e5b700f07ca6727d741c6ae949f76b12e3b28ebdd34fcb9db19bae18e9f" exitCode=0
Apr 24 19:07:14.333910 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:14.333584 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerDied","Data":"960e6e5b700f07ca6727d741c6ae949f76b12e3b28ebdd34fcb9db19bae18e9f"}
Apr 24 19:07:15.090887 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:15.090853 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv"
Apr 24 19:07:15.091064 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:15.091022 2564 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 19:07:15.091110 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:15.091100 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret podName:62836674-92b3-4b2c-a4c9-e6896f0ff8fa nodeName:}" failed. No retries permitted until 2026-04-24 19:07:23.091080516 +0000 UTC m=+36.415324660 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret") pod "global-pull-secret-syncer-v6hxv" (UID: "62836674-92b3-4b2c-a4c9-e6896f0ff8fa") : object "kube-system"/"original-pull-secret" not registered
Apr 24 19:07:15.180583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:15.180528 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv"
Apr 24 19:07:15.180583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:15.180567 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:15.180854 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:15.180696 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa"
Apr 24 19:07:15.180854 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:15.180808 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:07:15.180973 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:15.180874 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:15.180973 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:15.180962 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:17.181640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:17.181410 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv"
Apr 24 19:07:17.182105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:17.181487 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:17.182105 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:17.181738 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa"
Apr 24 19:07:17.182105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:17.181514 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:17.182105 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:17.181833 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:17.182105 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:17.181875 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:07:19.180684 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.180639 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:19.181450 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.180639 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv"
Apr 24 19:07:19.181450 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.180814 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:07:19.181450 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.180639 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:19.181450 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.180868 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-v6hxv" podUID="62836674-92b3-4b2c-a4c9-e6896f0ff8fa"
Apr 24 19:07:19.181450 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.180923 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-mdmw5" podUID="f7682155-10ef-40a7-9a0c-cef3315bdd30"
Apr 24 19:07:19.553498 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.553470 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-124.ec2.internal" event="NodeReady"
Apr 24 19:07:19.553681 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.553627 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 19:07:19.586609 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.586575 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-c54c7b55f-r457g"]
Apr 24 19:07:19.609270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.609236 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"]
Apr 24 19:07:19.609436 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.609406 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.611953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.611929 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 19:07:19.611953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.611935 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 19:07:19.612115 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.611967 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 19:07:19.612454 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.612440 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kn9zh\""
Apr 24 19:07:19.616707 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.616360 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 19:07:19.630213 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.630192 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"]
Apr 24 19:07:19.630213 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.630215 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c54c7b55f-r457g"]
Apr 24 19:07:19.630370 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.630226 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hqq8t"]
Apr 24 19:07:19.630370 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.630331 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:19.632777 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.632749 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 24 19:07:19.632917 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.632816 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 24 19:07:19.632917 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.632898 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7kwb2\""
Apr 24 19:07:19.647732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.647704 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-jzpqz"]
Apr 24 19:07:19.647842 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.647826 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.649914 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.649878 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d7j6w\""
Apr 24 19:07:19.650036 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.649942 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 24 19:07:19.650105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.650059 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 24 19:07:19.666031 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.666011 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jzpqz"]
Apr 24 19:07:19.666031 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.666034 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hqq8t"]
Apr 24 19:07:19.666175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.666126 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:07:19.668671 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.668651 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 24 19:07:19.668821 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.668702 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 24 19:07:19.668821 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.668655 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tkcsj\""
Apr 24 19:07:19.668821 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.668711 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 24 19:07:19.725266 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725237 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/567b5df7-6cf3-459e-a1cc-68aa56346b42-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:19.725422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725279 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72cbc52d-39db-4924-9a8f-438bf75f9a50-tmp-dir\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.725422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume
\"kube-api-access-kq2w2\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-kube-api-access-kq2w2\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.725422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725329 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-installation-pull-secrets\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.725422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725355 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.725422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725370 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-ca-trust-extracted\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.725422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725399 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:19.725644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725429 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72cbc52d-39db-4924-9a8f-438bf75f9a50-config-volume\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.725644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725447 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-image-registry-private-configuration\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.725644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725466 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k4zr\" (UniqueName: \"kubernetes.io/projected/72cbc52d-39db-4924-9a8f-438bf75f9a50-kube-api-access-8k4zr\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.725644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725483 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-certificates\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.725644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725506 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-bound-sa-token\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.725644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725529 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.725644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.725545 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-trusted-ca\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.826600 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826496 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72cbc52d-39db-4924-9a8f-438bf75f9a50-tmp-dir\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.826600 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826535 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kq2w2\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-kube-api-access-kq2w2\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") "
pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.826600 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826574 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826607 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-installation-pull-secrets\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826633 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826657 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-ca-trust-extracted\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826695 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826726 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72cbc52d-39db-4924-9a8f-438bf75f9a50-config-volume\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826753 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-image-registry-private-configuration\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.826774 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.826793 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 19:07:19.826852 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.826844 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.326824253 +0000 UTC m=+33.651068408 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.826872 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.326855582 +0000 UTC m=+33.651099742 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826903 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72cbc52d-39db-4924-9a8f-438bf75f9a50-tmp-dir\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.826960 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfrdx\" (UniqueName: \"kubernetes.io/projected/ab654b1c-fb83-4468-9595-2d19444f6f70-kube-api-access-vfrdx\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827009 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8k4zr\" (UniqueName: \"kubernetes.io/projected/72cbc52d-39db-4924-9a8f-438bf75f9a50-kube-api-access-8k4zr\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827035 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-certificates\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827060 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-bound-sa-token\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827090 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827119 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-trusted-ca\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424
19:07:19.827171 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/567b5df7-6cf3-459e-a1cc-68aa56346b42-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:19.827255 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827088 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-ca-trust-extracted\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.827727 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.827315 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 19:07:19.827727 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.827329 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found
Apr 24 19:07:19.827727 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827357 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/72cbc52d-39db-4924-9a8f-438bf75f9a50-config-volume\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.827727 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.827395 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.327383248 +0000 UTC m=+33.651627390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found
Apr 24 19:07:19.827727 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.827594 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-certificates\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.830939 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.830912 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-installation-pull-secrets\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.831068 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.831050 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-image-registry-private-configuration\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.836778 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.836749 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/567b5df7-6cf3-459e-a1cc-68aa56346b42-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:19.836961 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.836945 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-bound-sa-token\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.837494 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.837468 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k4zr\" (UniqueName: \"kubernetes.io/projected/72cbc52d-39db-4924-9a8f-438bf75f9a50-kube-api-access-8k4zr\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:19.837691 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.837675 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-trusted-ca\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.838051 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.838036 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kq2w2\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-kube-api-access-kq2w2\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:19.927795 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.927762 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfrdx\" (UniqueName: \"kubernetes.io/projected/ab654b1c-fb83-4468-9595-2d19444f6f70-kube-api-access-vfrdx\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:07:19.927982 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.927851 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:07:19.927982 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.927936 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:19.928083 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:19.928001 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:20.427984886 +0000 UTC m=+33.752229028 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found
Apr 24 19:07:19.936211 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:19.936188 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfrdx\" (UniqueName: \"kubernetes.io/projected/ab654b1c-fb83-4468-9595-2d19444f6f70-kube-api-access-vfrdx\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:07:20.330636 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:20.330546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:20.330636 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:20.330616 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:20.330646 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]:
E0424 19:07:20.330660 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.330733 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:21.330713106 +0000 UTC m=+34.654957260 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.330743 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.330752 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.330765 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.330801 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:21.330789007 +0000 UTC m=+34.655033165 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found Apr 24 19:07:20.331327 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.330817 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:21.330808775 +0000 UTC m=+34.655052916 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found Apr 24 19:07:20.350202 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:20.350176 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerStarted","Data":"448a67c0378c81898408252a62fba1d9b4de5cba8971e7155240882dafa6d76c"} Apr 24 19:07:20.432203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:20.432167 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:07:20.432372 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.432295 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:20.432372 ip-10-0-129-124 kubenswrapper[2564]: E0424 
19:07:20.432350 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:21.432337224 +0000 UTC m=+34.756581369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found Apr 24 19:07:20.835687 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:20.835662 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:07:20.835843 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.835823 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:20.835912 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.835902 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:07:52.835882184 +0000 UTC m=+66.160126346 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 19:07:20.936836 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:20.936799 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:07:20.936995 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.936955 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 19:07:20.936995 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.936977 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 19:07:20.936995 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.936987 2564 projected.go:194] Error preparing data for projected volume kube-api-access-rtwff for pod openshift-network-diagnostics/network-check-target-mdmw5: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:20.937102 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:20.937038 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff podName:f7682155-10ef-40a7-9a0c-cef3315bdd30 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:52.937023804 +0000 UTC m=+66.261267944 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rtwff" (UniqueName: "kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff") pod "network-check-target-mdmw5" (UID: "f7682155-10ef-40a7-9a0c-cef3315bdd30") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 19:07:21.180585 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.180478 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:07:21.180585 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.180514 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:21.180585 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.180522 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5" Apr 24 19:07:21.183314 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.183284 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 24 19:07:21.183455 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.183331 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 19:07:21.184672 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.184651 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h6pgv\"" Apr 24 19:07:21.184780 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.184714 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 19:07:21.184780 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.184723 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-72mrf\"" Apr 24 19:07:21.184885 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.184718 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 24 19:07:21.340595 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.340530 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g" Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.340709 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.340729 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.340729 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t" Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.340761 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.340798 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:23.340774629 +0000 UTC m=+36.665018785 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.340844 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.340864 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.340895 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:23.340878253 +0000 UTC m=+36.665122396 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found Apr 24 19:07:21.341125 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.340912 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:23.340902492 +0000 UTC m=+36.665146632 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found Apr 24 19:07:21.353970 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.353944 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d" containerID="448a67c0378c81898408252a62fba1d9b4de5cba8971e7155240882dafa6d76c" exitCode=0 Apr 24 19:07:21.354105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.353992 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerDied","Data":"448a67c0378c81898408252a62fba1d9b4de5cba8971e7155240882dafa6d76c"} Apr 24 19:07:21.441793 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:21.441667 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:07:21.441793 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.441753 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:21.442008 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:21.441831 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:23.441807752 +0000 UTC m=+36.766051915 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found Apr 24 19:07:22.357969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:22.357934 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d" containerID="d6f24fc220e9b6e0cbfe08655b64399be4093cf39a2686912028eb5b52e1d9e9" exitCode=0 Apr 24 19:07:22.358333 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:22.357976 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerDied","Data":"d6f24fc220e9b6e0cbfe08655b64399be4093cf39a2686912028eb5b52e1d9e9"} Apr 24 19:07:23.156407 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.156318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:23.159247 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.159215 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/62836674-92b3-4b2c-a4c9-e6896f0ff8fa-original-pull-secret\") pod \"global-pull-secret-syncer-v6hxv\" (UID: \"62836674-92b3-4b2c-a4c9-e6896f0ff8fa\") " pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:23.297094 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.297047 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-v6hxv" Apr 24 19:07:23.357702 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.357664 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" Apr 24 19:07:23.357840 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.357715 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g" Apr 24 19:07:23.357840 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.357772 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t" Apr 24 19:07:23.357840 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.357815 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:07:23.357950 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.357864 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:23.357950 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.357872 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found 
Apr 24 19:07:23.357950 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.357890 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found Apr 24 19:07:23.357950 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.357883 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:27.357866883 +0000 UTC m=+40.682111024 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found Apr 24 19:07:23.357950 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.357949 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:27.357935414 +0000 UTC m=+40.682179560 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found Apr 24 19:07:23.358345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.357960 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. 
No retries permitted until 2026-04-24 19:07:27.357954772 +0000 UTC m=+40.682198913 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found Apr 24 19:07:23.362083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.362060 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" event={"ID":"e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d","Type":"ContainerStarted","Data":"8b999990e762e13b4e38217253ebe98a4d74f277f96b265d0dfa1fd49d498730"} Apr 24 19:07:23.388149 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.388103 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-rb2v6" podStartSLOduration=6.087225979 podStartE2EDuration="36.388088802s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:06:49.751680003 +0000 UTC m=+3.075924148" lastFinishedPulling="2026-04-24 19:07:20.052542831 +0000 UTC m=+33.376786971" observedRunningTime="2026-04-24 19:07:23.386669222 +0000 UTC m=+36.710913396" watchObservedRunningTime="2026-04-24 19:07:23.388088802 +0000 UTC m=+36.712332966" Apr 24 19:07:23.459082 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.459047 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:07:23.459229 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.459216 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:23.459290 
ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:23.459281 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:27.45926581 +0000 UTC m=+40.783509951 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found Apr 24 19:07:23.474016 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:23.473986 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-v6hxv"] Apr 24 19:07:23.479260 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:07:23.479226 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62836674_92b3_4b2c_a4c9_e6896f0ff8fa.slice/crio-bef1d0fa3d92c8961f4b30c6e0584fee5c781ae8d3b76efa00930d4c40ca096c WatchSource:0}: Error finding container bef1d0fa3d92c8961f4b30c6e0584fee5c781ae8d3b76efa00930d4c40ca096c: Status 404 returned error can't find the container with id bef1d0fa3d92c8961f4b30c6e0584fee5c781ae8d3b76efa00930d4c40ca096c Apr 24 19:07:24.365148 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:24.364940 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-v6hxv" event={"ID":"62836674-92b3-4b2c-a4c9-e6896f0ff8fa","Type":"ContainerStarted","Data":"bef1d0fa3d92c8961f4b30c6e0584fee5c781ae8d3b76efa00930d4c40ca096c"} Apr 24 19:07:27.395093 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:27.395056 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:27.395117 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g" Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:27.395201 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t" Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.395209 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.395209 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.395277 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:35.395256201 +0000 UTC m=+48.719500352 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.395282 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.395302 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.395318 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:35.395307435 +0000 UTC m=+48.719551578 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found Apr 24 19:07:27.395571 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.395355 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:35.395340048 +0000 UTC m=+48.719584192 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found Apr 24 19:07:27.495913 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:27.495879 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:07:27.496077 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.496017 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:27.496131 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:27.496095 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:35.496075085 +0000 UTC m=+48.820319227 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found Apr 24 19:07:28.374599 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:28.374563 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-v6hxv" event={"ID":"62836674-92b3-4b2c-a4c9-e6896f0ff8fa","Type":"ContainerStarted","Data":"6ca5385dccb0fe60b3f15953207934502b989f611e57d2dbd65039a6caa1d7a1"} Apr 24 19:07:28.390680 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:28.390634 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-v6hxv" podStartSLOduration=17.468655279 podStartE2EDuration="21.390618844s" podCreationTimestamp="2026-04-24 19:07:07 +0000 UTC" firstStartedPulling="2026-04-24 19:07:23.481086687 +0000 UTC m=+36.805330827" lastFinishedPulling="2026-04-24 19:07:27.40305024 +0000 UTC m=+40.727294392" observedRunningTime="2026-04-24 19:07:28.390183545 +0000 UTC m=+41.714427709" watchObservedRunningTime="2026-04-24 19:07:28.390618844 +0000 UTC m=+41.714863060" Apr 24 19:07:33.001806 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.001774 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n"] Apr 24 19:07:33.037982 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.037944 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv"] Apr 24 19:07:33.038141 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.038026 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.040801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.040775 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-rx7kq\"" Apr 24 19:07:33.040801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.040789 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 24 19:07:33.041010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.040788 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 24 19:07:33.041010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.040870 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 24 19:07:33.046194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.046173 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 24 19:07:33.053372 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.053353 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"] Apr 24 19:07:33.053485 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.053463 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.056064 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.056043 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 24 19:07:33.056298 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.056282 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 24 19:07:33.056360 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.056346 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 24 19:07:33.056420 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.056371 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 24 19:07:33.077077 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.077052 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n"] Apr 24 19:07:33.077077 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.077076 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv"] Apr 24 19:07:33.077230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.077088 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"] Apr 24 19:07:33.077230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.077199 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.079422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.079405 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 24 19:07:33.139040 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.139008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0f7170b-c6d3-43d4-8ee1-f08b891f76d7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77f6bc48d7-db22n\" (UID: \"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.139209 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.139066 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cndkh\" (UniqueName: \"kubernetes.io/projected/e0f7170b-c6d3-43d4-8ee1-f08b891f76d7-kube-api-access-cndkh\") pod \"managed-serviceaccount-addon-agent-77f6bc48d7-db22n\" (UID: \"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.239871 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.239833 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-ca\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.240038 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.239878 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.240038 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.239944 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d19ce43a-56ae-4f56-b4a4-4b8543affb86-tmp\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.240121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240046 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/6257d846-2381-45dc-8742-f277854e8058-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.240121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240072 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cndkh\" (UniqueName: \"kubernetes.io/projected/e0f7170b-c6d3-43d4-8ee1-f08b891f76d7-kube-api-access-cndkh\") pod \"managed-serviceaccount-addon-agent-77f6bc48d7-db22n\" (UID: \"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.240121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240109 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98tn7\" (UniqueName: 
\"kubernetes.io/projected/6257d846-2381-45dc-8742-f277854e8058-kube-api-access-98tn7\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.240230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240153 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-hub\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.240230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240208 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn4bf\" (UniqueName: \"kubernetes.io/projected/d19ce43a-56ae-4f56-b4a4-4b8543affb86-kube-api-access-kn4bf\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.240314 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240258 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0f7170b-c6d3-43d4-8ee1-f08b891f76d7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77f6bc48d7-db22n\" (UID: \"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.240314 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240283 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d19ce43a-56ae-4f56-b4a4-4b8543affb86-klusterlet-config\") pod 
\"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.240407 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.240320 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.246901 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.246865 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/e0f7170b-c6d3-43d4-8ee1-f08b891f76d7-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-77f6bc48d7-db22n\" (UID: \"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.248769 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.248747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cndkh\" (UniqueName: \"kubernetes.io/projected/e0f7170b-c6d3-43d4-8ee1-f08b891f76d7-kube-api-access-cndkh\") pod \"managed-serviceaccount-addon-agent-77f6bc48d7-db22n\" (UID: \"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.340772 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340687 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98tn7\" (UniqueName: \"kubernetes.io/projected/6257d846-2381-45dc-8742-f277854e8058-kube-api-access-98tn7\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: 
\"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.340772 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340736 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-hub\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.340772 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340759 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kn4bf\" (UniqueName: \"kubernetes.io/projected/d19ce43a-56ae-4f56-b4a4-4b8543affb86-kube-api-access-kn4bf\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.341038 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340802 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d19ce43a-56ae-4f56-b4a4-4b8543affb86-klusterlet-config\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.341038 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340834 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.341038 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340869 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-ca\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.341038 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340896 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.341038 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.340920 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d19ce43a-56ae-4f56-b4a4-4b8543affb86-tmp\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.341270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.341145 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/6257d846-2381-45dc-8742-f277854e8058-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.341376 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.341347 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/d19ce43a-56ae-4f56-b4a4-4b8543affb86-tmp\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.341935 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.341908 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/6257d846-2381-45dc-8742-f277854e8058-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.343381 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.343348 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-hub\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.343504 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.343433 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-ca\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.343504 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.343474 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d19ce43a-56ae-4f56-b4a4-4b8543affb86-klusterlet-config\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.343763 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.343743 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.343955 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.343934 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6257d846-2381-45dc-8742-f277854e8058-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.348889 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.348862 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98tn7\" (UniqueName: \"kubernetes.io/projected/6257d846-2381-45dc-8742-f277854e8058-kube-api-access-98tn7\") pod \"cluster-proxy-proxy-agent-5b5989f66-gg7kv\" (UID: \"6257d846-2381-45dc-8742-f277854e8058\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.348982 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.348894 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn4bf\" (UniqueName: \"kubernetes.io/projected/d19ce43a-56ae-4f56-b4a4-4b8543affb86-kube-api-access-kn4bf\") pod \"klusterlet-addon-workmgr-dd886988c-4srtr\" (UID: \"d19ce43a-56ae-4f56-b4a4-4b8543affb86\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.354710 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:07:33.354692 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" Apr 24 19:07:33.361416 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.361394 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:07:33.399695 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.399648 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" Apr 24 19:07:33.782935 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.782906 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n"] Apr 24 19:07:33.786126 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.786092 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"] Apr 24 19:07:33.786870 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:33.786851 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv"] Apr 24 19:07:34.387016 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:34.386958 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" event={"ID":"d19ce43a-56ae-4f56-b4a4-4b8543affb86","Type":"ContainerStarted","Data":"216b978236a1ffa78f189f7be4b751d12a2b569d6acf464cc9632faf22a64ebe"} Apr 24 19:07:34.388441 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:34.388413 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" 
event={"ID":"6257d846-2381-45dc-8742-f277854e8058","Type":"ContainerStarted","Data":"e5c2baed175a80e60cbdc98e6a09f60a2d025e025bfa045a1883706186a97ec8"} Apr 24 19:07:34.389495 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:34.389474 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" event={"ID":"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7","Type":"ContainerStarted","Data":"777bbbb35355c2cf7cd77849f28e3977e04167f985b4736c81cc393bf562d190"} Apr 24 19:07:35.461850 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:35.461633 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t" Apr 24 19:07:35.461850 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:35.461686 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" Apr 24 19:07:35.461850 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:35.461716 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g" Apr 24 19:07:35.461850 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.461830 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: 
secret "networking-console-plugin-cert" not found Apr 24 19:07:35.462391 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.461832 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 19:07:35.462391 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.461909 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 19:07:35.462391 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.461913 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found Apr 24 19:07:35.462391 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.461896 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:51.46187768 +0000 UTC m=+64.786121826 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found Apr 24 19:07:35.462391 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.462008 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:51.461989265 +0000 UTC m=+64.786233409 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found Apr 24 19:07:35.462391 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.462027 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:51.462018545 +0000 UTC m=+64.786262686 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found Apr 24 19:07:35.562478 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:35.562440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:07:35.562698 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.562655 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 19:07:35.562776 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:35.562731 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:07:51.562709571 +0000 UTC m=+64.886953730 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found
Apr 24 19:07:38.399773 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:38.399731 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" event={"ID":"6257d846-2381-45dc-8742-f277854e8058","Type":"ContainerStarted","Data":"4168e076d876f97e1af660c33c20d7c73ce842de465c8f7bab9f785d02ee49a4"}
Apr 24 19:07:38.401284 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:38.401256 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" event={"ID":"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7","Type":"ContainerStarted","Data":"5338695e2e2d9e90a7bc7073b907f5a18bcd2b1f8e69dcbf9769fc8c725e65c3"}
Apr 24 19:07:39.404848 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:39.404808 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" event={"ID":"d19ce43a-56ae-4f56-b4a4-4b8543affb86","Type":"ContainerStarted","Data":"ea9ad16cded2650b8dd588cdbbfbfedca9705829fa36ce2fb7ff5335584c3158"}
Apr 24 19:07:39.423278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:39.423218 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" podStartSLOduration=3.552846804 podStartE2EDuration="7.423199031s" podCreationTimestamp="2026-04-24 19:07:32 +0000 UTC" firstStartedPulling="2026-04-24 19:07:33.79795937 +0000 UTC m=+47.122203512" lastFinishedPulling="2026-04-24 19:07:37.668311581 +0000 UTC m=+50.992555739" observedRunningTime="2026-04-24 19:07:38.416060826 +0000 UTC m=+51.740304988" watchObservedRunningTime="2026-04-24 19:07:39.423199031 +0000 UTC m=+52.747443195"
Apr 24 19:07:39.424178 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:39.424136 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" podStartSLOduration=2.289163322 podStartE2EDuration="7.424120083s" podCreationTimestamp="2026-04-24 19:07:32 +0000 UTC" firstStartedPulling="2026-04-24 19:07:33.799032483 +0000 UTC m=+47.123276628" lastFinishedPulling="2026-04-24 19:07:38.933989234 +0000 UTC m=+52.258233389" observedRunningTime="2026-04-24 19:07:39.42299034 +0000 UTC m=+52.747234506" watchObservedRunningTime="2026-04-24 19:07:39.424120083 +0000 UTC m=+52.748364250"
Apr 24 19:07:40.407222 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:40.407179 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"
Apr 24 19:07:40.409235 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:40.409210 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"
Apr 24 19:07:41.411102 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:41.411065 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" event={"ID":"6257d846-2381-45dc-8742-f277854e8058","Type":"ContainerStarted","Data":"68b12cde808417f49f08ea346f5a111a80ca22b0d43d8cf38eb143167b65fdba"}
Apr 24 19:07:41.411102 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:41.411104 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" event={"ID":"6257d846-2381-45dc-8742-f277854e8058","Type":"ContainerStarted","Data":"f131929bfc7631140c62836cc6d2cddf91749ac7b0ea6b2a2d9747e6f7001039"}
Apr 24 19:07:41.431051 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:41.431005 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" podStartSLOduration=2.631910156 podStartE2EDuration="9.430987297s" podCreationTimestamp="2026-04-24 19:07:32 +0000 UTC" firstStartedPulling="2026-04-24 19:07:33.798519264 +0000 UTC m=+47.122763417" lastFinishedPulling="2026-04-24 19:07:40.597596403 +0000 UTC m=+53.921840558" observedRunningTime="2026-04-24 19:07:41.429440484 +0000 UTC m=+54.753684647" watchObservedRunningTime="2026-04-24 19:07:41.430987297 +0000 UTC m=+54.755231459"
Apr 24 19:07:44.343803 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:44.343775 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-d6t5k"
Apr 24 19:07:51.489918 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:51.489874 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:51.489931 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.490023 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.490028 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.490049 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:51.490033 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.490093 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.490099 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:23.490080202 +0000 UTC m=+96.814324348 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.490146 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:23.490135249 +0000 UTC m=+96.814379395 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found
Apr 24 19:07:51.490345 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.490161 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:23.490153509 +0000 UTC m=+96.814397650 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found
Apr 24 19:07:51.590875 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:51.590841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:07:51.591016 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.590954 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:07:51.591016 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:51.591006 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:08:23.590993952 +0000 UTC m=+96.915238093 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found
Apr 24 19:07:52.900662 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:52.900628 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:07:52.903247 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:52.903227 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 19:07:52.911081 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:52.911061 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 19:07:52.911143 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:07:52.911114 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:08:56.911099442 +0000 UTC m=+130.235343583 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : secret "metrics-daemon-secret" not found
Apr 24 19:07:53.001019 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.000986 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:53.003903 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.003883 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 24 19:07:53.014467 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.014445 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 24 19:07:53.024857 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.024823 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtwff\" (UniqueName: \"kubernetes.io/projected/f7682155-10ef-40a7-9a0c-cef3315bdd30-kube-api-access-rtwff\") pod \"network-check-target-mdmw5\" (UID: \"f7682155-10ef-40a7-9a0c-cef3315bdd30\") " pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:53.302622 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.302596 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-72mrf\""
Apr 24 19:07:53.310792 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.310766 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:07:53.425345 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.425315 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-mdmw5"]
Apr 24 19:07:53.428228 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:07:53.428194 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7682155_10ef_40a7_9a0c_cef3315bdd30.slice/crio-7df16b4561712d6f0f8e8b5754a34106217f839264bb3a052b4b7cec54640cae WatchSource:0}: Error finding container 7df16b4561712d6f0f8e8b5754a34106217f839264bb3a052b4b7cec54640cae: Status 404 returned error can't find the container with id 7df16b4561712d6f0f8e8b5754a34106217f839264bb3a052b4b7cec54640cae
Apr 24 19:07:53.435810 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:53.435784 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mdmw5" event={"ID":"f7682155-10ef-40a7-9a0c-cef3315bdd30","Type":"ContainerStarted","Data":"7df16b4561712d6f0f8e8b5754a34106217f839264bb3a052b4b7cec54640cae"}
Apr 24 19:07:56.445288 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:56.445254 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-mdmw5" event={"ID":"f7682155-10ef-40a7-9a0c-cef3315bdd30","Type":"ContainerStarted","Data":"5fd1a68a517ef8e3ff4c0c9e46977ca835eaad408569f40c90413edbcb989021"}
Apr 24 19:07:56.445702 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:07:56.445364 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:08:23.553220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:08:23.553182 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:08:23.553220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:08:23.553223 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.553318 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:08:23.553340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.553394 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls podName:72cbc52d-39db-4924-9a8f-438bf75f9a50 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:27.55338078 +0000 UTC m=+160.877624921 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls") pod "dns-default-hqq8t" (UID: "72cbc52d-39db-4924-9a8f-438bf75f9a50") : secret "dns-default-metrics-tls" not found
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.553435 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.553443 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.553456 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-c54c7b55f-r457g: secret "image-registry-tls" not found
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.553485 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:27.553473498 +0000 UTC m=+160.877717638 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found
Apr 24 19:08:23.553779 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.553496 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls podName:11fadb13-0cb8-4aea-9078-f23ecd9c91a7 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:27.553490796 +0000 UTC m=+160.877734937 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls") pod "image-registry-c54c7b55f-r457g" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7") : secret "image-registry-tls" not found
Apr 24 19:08:23.654393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:08:23.654354 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:08:23.654579 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.654523 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:08:23.654635 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:23.654615 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:09:27.654598799 +0000 UTC m=+160.978842945 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found
Apr 24 19:08:27.450494 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:08:27.450458 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-mdmw5"
Apr 24 19:08:27.465929 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:08:27.465875 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-mdmw5" podStartSLOduration=97.708972644 podStartE2EDuration="1m40.465860615s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:07:53.430092236 +0000 UTC m=+66.754336391" lastFinishedPulling="2026-04-24 19:07:56.186980213 +0000 UTC m=+69.511224362" observedRunningTime="2026-04-24 19:07:56.463872952 +0000 UTC m=+69.788117115" watchObservedRunningTime="2026-04-24 19:08:27.465860615 +0000 UTC m=+100.790104777"
Apr 24 19:08:56.993662 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:08:56.993622 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:08:56.994249 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:56.993806 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 24 19:08:56.994249 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:08:56.993908 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs podName:9104b10b-fe94-4977-b556-addf9a7f232f nodeName:}" failed. No retries permitted until 2026-04-24 19:10:58.99388413 +0000 UTC m=+252.318128271 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs") pod "network-metrics-daemon-p2bz2" (UID: "9104b10b-fe94-4977-b556-addf9a7f232f") : secret "metrics-daemon-secret" not found
Apr 24 19:09:20.956129 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:20.956101 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j4w5m_6d09a5ce-72ac-4dea-826a-c4a0beb36ce3/dns-node-resolver/0.log"
Apr 24 19:09:21.954059 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:21.954032 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nbt8b_e51af979-d2ff-49ba-8e4e-7620a2a4cd7e/node-ca/0.log"
Apr 24 19:09:22.620232 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:22.620187 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" podUID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7"
Apr 24 19:09:22.639376 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:22.639343 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" podUID="567b5df7-6cf3-459e-a1cc-68aa56346b42"
Apr 24 19:09:22.643103 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:22.643076 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:09:22.656594 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:22.656534 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-hqq8t" podUID="72cbc52d-39db-4924-9a8f-438bf75f9a50"
Apr 24 19:09:22.673717 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:22.673689 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-jzpqz" podUID="ab654b1c-fb83-4468-9595-2d19444f6f70"
Apr 24 19:09:23.644833 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:23.644797 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:09:24.190881 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:24.190835 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-p2bz2" podUID="9104b10b-fe94-4977-b556-addf9a7f232f"
Apr 24 19:09:27.635263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.635222 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:09:27.635883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.635277 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:09:27.635883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.635317 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:09:27.635883 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:27.635455 2564 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 24 19:09:27.635883 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:27.635533 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert podName:567b5df7-6cf3-459e-a1cc-68aa56346b42 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:29.635515991 +0000 UTC m=+282.959760133 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-tfqzr" (UID: "567b5df7-6cf3-459e-a1cc-68aa56346b42") : secret "networking-console-plugin-cert" not found
Apr 24 19:09:27.637626 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.637604 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"image-registry-c54c7b55f-r457g\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:09:27.638299 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.638281 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/72cbc52d-39db-4924-9a8f-438bf75f9a50-metrics-tls\") pod \"dns-default-hqq8t\" (UID: \"72cbc52d-39db-4924-9a8f-438bf75f9a50\") " pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:09:27.736326 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.736285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:09:27.736513 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:27.736453 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 19:09:27.736613 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:09:27.736529 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert podName:ab654b1c-fb83-4468-9595-2d19444f6f70 nodeName:}" failed. No retries permitted until 2026-04-24 19:11:29.736508406 +0000 UTC m=+283.060752562 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert") pod "ingress-canary-jzpqz" (UID: "ab654b1c-fb83-4468-9595-2d19444f6f70") : secret "canary-serving-cert" not found
Apr 24 19:09:27.746717 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.746691 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-kn9zh\""
Apr 24 19:09:27.755057 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.755038 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:09:27.848143 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.848116 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-d7j6w\""
Apr 24 19:09:27.855688 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.855658 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:09:27.884660 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.884592 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-c54c7b55f-r457g"]
Apr 24 19:09:27.889118 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:09:27.889086 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11fadb13_0cb8_4aea_9078_f23ecd9c91a7.slice/crio-32bccf8334d411fe85b09cae43cf6fd274b87c1b40d40f3a5d1f3f15935ba276 WatchSource:0}: Error finding container 32bccf8334d411fe85b09cae43cf6fd274b87c1b40d40f3a5d1f3f15935ba276: Status 404 returned error can't find the container with id 32bccf8334d411fe85b09cae43cf6fd274b87c1b40d40f3a5d1f3f15935ba276
Apr 24 19:09:27.983072 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:27.983041 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hqq8t"]
Apr 24 19:09:27.986967 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:09:27.986935 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72cbc52d_39db_4924_9a8f_438bf75f9a50.slice/crio-a9050ff0ae99120aa857dd4f1e5c7a14e1555cbea5e744c505b3a84cc29f1734 WatchSource:0}: Error finding container a9050ff0ae99120aa857dd4f1e5c7a14e1555cbea5e744c505b3a84cc29f1734: Status 404 returned error can't find the container with id a9050ff0ae99120aa857dd4f1e5c7a14e1555cbea5e744c505b3a84cc29f1734
Apr 24 19:09:28.657803 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:28.657767 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqq8t" event={"ID":"72cbc52d-39db-4924-9a8f-438bf75f9a50","Type":"ContainerStarted","Data":"a9050ff0ae99120aa857dd4f1e5c7a14e1555cbea5e744c505b3a84cc29f1734"}
Apr 24 19:09:28.659400 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:28.659369 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" event={"ID":"11fadb13-0cb8-4aea-9078-f23ecd9c91a7","Type":"ContainerStarted","Data":"8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81"}
Apr 24 19:09:28.659523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:28.659407 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" event={"ID":"11fadb13-0cb8-4aea-9078-f23ecd9c91a7","Type":"ContainerStarted","Data":"32bccf8334d411fe85b09cae43cf6fd274b87c1b40d40f3a5d1f3f15935ba276"}
Apr 24 19:09:28.659523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:28.659495 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-c54c7b55f-r457g"
Apr 24 19:09:28.682863 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:28.682811 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" podStartSLOduration=161.682795259 podStartE2EDuration="2m41.682795259s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:09:28.681899324 +0000 UTC m=+162.006143486" watchObservedRunningTime="2026-04-24 19:09:28.682795259 +0000 UTC m=+162.007039421"
Apr 24 19:09:29.664873 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:29.664840 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqq8t" event={"ID":"72cbc52d-39db-4924-9a8f-438bf75f9a50","Type":"ContainerStarted","Data":"3c043ec648ceac0df8c113f13b8023ceedce52cf94946ea161dbf3484d0e6523"}
Apr 24 19:09:29.665267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:29.664882 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hqq8t" event={"ID":"72cbc52d-39db-4924-9a8f-438bf75f9a50","Type":"ContainerStarted","Data":"b15f8583857229fee6bd2d90f2fa2e695280394dc6843c36a6628a32b6eb199f"}
Apr 24 19:09:29.665267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:29.665033 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:09:29.706206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:29.706154 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hqq8t" podStartSLOduration=129.268617299 podStartE2EDuration="2m10.706138928s" podCreationTimestamp="2026-04-24 19:07:19 +0000 UTC" firstStartedPulling="2026-04-24 19:09:27.98881375 +0000 UTC m=+161.313057891" lastFinishedPulling="2026-04-24 19:09:29.426335377 +0000 UTC m=+162.750579520" observedRunningTime="2026-04-24 19:09:29.705491933 +0000 UTC m=+163.029736109" watchObservedRunningTime="2026-04-24 19:09:29.706138928 +0000 UTC m=+163.030383090"
Apr 24 19:09:34.179983 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:34.179948 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"
Apr 24 19:09:38.180884 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:38.180787 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jzpqz"
Apr 24 19:09:38.687001 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:38.686964 2564 generic.go:358] "Generic (PLEG): container finished" podID="e0f7170b-c6d3-43d4-8ee1-f08b891f76d7" containerID="5338695e2e2d9e90a7bc7073b907f5a18bcd2b1f8e69dcbf9769fc8c725e65c3" exitCode=255
Apr 24 19:09:38.687174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:38.687016 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" event={"ID":"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7","Type":"ContainerDied","Data":"5338695e2e2d9e90a7bc7073b907f5a18bcd2b1f8e69dcbf9769fc8c725e65c3"}
Apr 24 19:09:38.687318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:38.687305 2564 scope.go:117] "RemoveContainer" containerID="5338695e2e2d9e90a7bc7073b907f5a18bcd2b1f8e69dcbf9769fc8c725e65c3"
Apr 24 19:09:39.180402 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:39.180366 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2"
Apr 24 19:09:39.668749 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:39.668720 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hqq8t"
Apr 24 19:09:39.691605 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:39.691541 2564 generic.go:358] "Generic (PLEG): container finished" podID="d19ce43a-56ae-4f56-b4a4-4b8543affb86" containerID="ea9ad16cded2650b8dd588cdbbfbfedca9705829fa36ce2fb7ff5335584c3158" exitCode=1
Apr 24 19:09:39.691780 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:39.691626 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" event={"ID":"d19ce43a-56ae-4f56-b4a4-4b8543affb86","Type":"ContainerDied","Data":"ea9ad16cded2650b8dd588cdbbfbfedca9705829fa36ce2fb7ff5335584c3158"}
Apr 24 19:09:39.692076 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:39.692012 2564 scope.go:117] "RemoveContainer" containerID="ea9ad16cded2650b8dd588cdbbfbfedca9705829fa36ce2fb7ff5335584c3158"
Apr 24 19:09:39.693330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:39.693297 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-77f6bc48d7-db22n" event={"ID":"e0f7170b-c6d3-43d4-8ee1-f08b891f76d7","Type":"ContainerStarted","Data":"70ef289fbc88e02b4edf6671930db408184cbdcf42f23dd4564d0f07a3040e74"}
Apr 24 19:09:40.407223 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:40.407184 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"
Apr 24 19:09:40.697949 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:40.697918 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr" event={"ID":"d19ce43a-56ae-4f56-b4a4-4b8543affb86","Type":"ContainerStarted","Data":"3a12f3baf086af23189b46e38fd3b35275aaf0508255f6bc47d54e9fd19715c2"}
Apr 24 19:09:40.698318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:40.698114 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"
Apr 24 19:09:40.698710 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:40.698694 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-dd886988c-4srtr"
Apr 24 19:09:46.365537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.365501 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jr97d"]
Apr 24 19:09:46.369946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.369927 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jr97d"
Apr 24 19:09:46.373219 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.373198 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 24 19:09:46.374014 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.373996 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 24 19:09:46.374136 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.374071 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 24 19:09:46.374208 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.374150 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-fsgkc\""
Apr 24 19:09:46.374208 ip-10-0-129-124 kubenswrapper[2564]: I0424
19:09:46.374200 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 19:09:46.393927 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.393893 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-crio-socket\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.394041 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.393934 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.394041 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.393955 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.394041 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.393997 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-data-volume\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.394159 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:09:46.394057 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jmh\" (UniqueName: \"kubernetes.io/projected/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-kube-api-access-44jmh\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.401439 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.401417 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jr97d"] Apr 24 19:09:46.494318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.494284 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.494318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.494318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.494519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.494341 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-data-volume\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.494519 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:09:46.494425 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44jmh\" (UniqueName: \"kubernetes.io/projected/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-kube-api-access-44jmh\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.494519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.494469 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-crio-socket\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.494630 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.494572 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-crio-socket\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.494838 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.494812 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-data-volume\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.494995 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.494976 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " 
pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.496610 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.496591 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.511322 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.511294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jmh\" (UniqueName: \"kubernetes.io/projected/f52eb3dd-0025-4b92-8240-ccd8cabc06b4-kube-api-access-44jmh\") pod \"insights-runtime-extractor-jr97d\" (UID: \"f52eb3dd-0025-4b92-8240-ccd8cabc06b4\") " pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.679459 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.679373 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jr97d" Apr 24 19:09:46.803839 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:46.803807 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jr97d"] Apr 24 19:09:46.807290 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:09:46.807260 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf52eb3dd_0025_4b92_8240_ccd8cabc06b4.slice/crio-69aa3a30c61b2bbc76c2d59a439c547f52ff58a4fe6f42c1bd72462ae187fd25 WatchSource:0}: Error finding container 69aa3a30c61b2bbc76c2d59a439c547f52ff58a4fe6f42c1bd72462ae187fd25: Status 404 returned error can't find the container with id 69aa3a30c61b2bbc76c2d59a439c547f52ff58a4fe6f42c1bd72462ae187fd25 Apr 24 19:09:47.715692 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:47.715656 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jr97d" event={"ID":"f52eb3dd-0025-4b92-8240-ccd8cabc06b4","Type":"ContainerStarted","Data":"b18ab9bf9478856f4cd6cb7d10b9ef44fec3bfa87076bab56cb6d481414fc476"} Apr 24 19:09:47.715692 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:47.715695 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jr97d" event={"ID":"f52eb3dd-0025-4b92-8240-ccd8cabc06b4","Type":"ContainerStarted","Data":"d6b8cf3c5921d961ca2fec3d5997e48d1c2a3f981d1cf29c0796568893420b42"} Apr 24 19:09:47.716094 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:47.715705 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jr97d" event={"ID":"f52eb3dd-0025-4b92-8240-ccd8cabc06b4","Type":"ContainerStarted","Data":"69aa3a30c61b2bbc76c2d59a439c547f52ff58a4fe6f42c1bd72462ae187fd25"} Apr 24 19:09:47.759048 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:47.759003 2564 patch_prober.go:28] interesting 
pod/image-registry-c54c7b55f-r457g container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 19:09:47.759218 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:47.759072 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" podUID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:09:49.669534 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:49.669504 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" Apr 24 19:09:49.722523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:49.722487 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jr97d" event={"ID":"f52eb3dd-0025-4b92-8240-ccd8cabc06b4","Type":"ContainerStarted","Data":"b793a879b1381dcef0881e012680b78c8a7d0919c416f7ce78810818701d8344"} Apr 24 19:09:49.759493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:49.759447 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jr97d" podStartSLOduration=1.6795917519999999 podStartE2EDuration="3.759431447s" podCreationTimestamp="2026-04-24 19:09:46 +0000 UTC" firstStartedPulling="2026-04-24 19:09:46.865207739 +0000 UTC m=+180.189451884" lastFinishedPulling="2026-04-24 19:09:48.945047438 +0000 UTC m=+182.269291579" observedRunningTime="2026-04-24 19:09:49.758723298 +0000 UTC m=+183.082967462" watchObservedRunningTime="2026-04-24 19:09:49.759431447 +0000 UTC m=+183.083675617" Apr 24 19:09:56.454837 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.454776 2564 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-monitoring/node-exporter-7jbh4"] Apr 24 19:09:56.458189 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.458166 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.462254 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.462218 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 24 19:09:56.462404 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.462254 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 19:09:56.462404 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.462296 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 24 19:09:56.463304 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.463282 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 19:09:56.463423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.463294 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 19:09:56.463423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.463343 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 19:09:56.464709 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.464627 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-xnczg\"" Apr 24 19:09:56.568443 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568406 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-textfile\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568443 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568442 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkksx\" (UniqueName: \"kubernetes.io/projected/493effb9-7158-4628-b742-23d621fdfb64-kube-api-access-wkksx\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568664 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568480 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-sys\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568664 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568512 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-root\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568664 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568529 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568664 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:09:56.568606 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-tls\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568664 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568630 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-wtmp\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568664 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568648 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/493effb9-7158-4628-b742-23d621fdfb64-metrics-client-ca\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.568845 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.568667 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.669908 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.669859 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-sys\") pod \"node-exporter-7jbh4\" 
(UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670036 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.669932 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-root\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670036 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.669952 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670036 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.669970 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-sys\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670036 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.669980 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-tls\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670033 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-root\") pod \"node-exporter-7jbh4\" (UID: 
\"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670054 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-wtmp\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670084 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/493effb9-7158-4628-b742-23d621fdfb64-metrics-client-ca\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670120 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670164 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-textfile\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670407 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670187 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wkksx\" (UniqueName: 
\"kubernetes.io/projected/493effb9-7158-4628-b742-23d621fdfb64-kube-api-access-wkksx\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670407 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670196 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-wtmp\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670507 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670472 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-textfile\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670775 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670752 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/493effb9-7158-4628-b742-23d621fdfb64-metrics-client-ca\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.670855 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.670775 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-accelerators-collector-config\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.672310 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.672290 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-tls\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.672422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.672407 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/493effb9-7158-4628-b742-23d621fdfb64-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.679073 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.679051 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkksx\" (UniqueName: \"kubernetes.io/projected/493effb9-7158-4628-b742-23d621fdfb64-kube-api-access-wkksx\") pod \"node-exporter-7jbh4\" (UID: \"493effb9-7158-4628-b742-23d621fdfb64\") " pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.767043 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:56.766955 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7jbh4" Apr 24 19:09:56.774903 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:09:56.774872 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod493effb9_7158_4628_b742_23d621fdfb64.slice/crio-115fdf8c20e359feaee079baf55601c68aa914856f87e8eeb8702d4666ba2dc3 WatchSource:0}: Error finding container 115fdf8c20e359feaee079baf55601c68aa914856f87e8eeb8702d4666ba2dc3: Status 404 returned error can't find the container with id 115fdf8c20e359feaee079baf55601c68aa914856f87e8eeb8702d4666ba2dc3 Apr 24 19:09:57.746259 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:57.746226 2564 generic.go:358] "Generic (PLEG): container finished" podID="493effb9-7158-4628-b742-23d621fdfb64" containerID="feffaacec9c2325a430db0b556f298c2923d9606e4df71fc2dbcfe738de5d632" exitCode=0 Apr 24 19:09:57.746648 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:57.746303 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jbh4" event={"ID":"493effb9-7158-4628-b742-23d621fdfb64","Type":"ContainerDied","Data":"feffaacec9c2325a430db0b556f298c2923d9606e4df71fc2dbcfe738de5d632"} Apr 24 19:09:57.746648 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:57.746345 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jbh4" event={"ID":"493effb9-7158-4628-b742-23d621fdfb64","Type":"ContainerStarted","Data":"115fdf8c20e359feaee079baf55601c68aa914856f87e8eeb8702d4666ba2dc3"} Apr 24 19:09:58.750294 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:58.750264 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jbh4" event={"ID":"493effb9-7158-4628-b742-23d621fdfb64","Type":"ContainerStarted","Data":"081701642e9fbf043a4955039b71b27405aed4b13ebfbb32c0ac27f4f68579cb"} Apr 24 19:09:58.750681 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:58.750298 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7jbh4" event={"ID":"493effb9-7158-4628-b742-23d621fdfb64","Type":"ContainerStarted","Data":"2a126051d0e4015cba3b065679710a5ea205f2e4492b906cc6f49b5e2942b545"} Apr 24 19:09:58.770804 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:09:58.770755 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7jbh4" podStartSLOduration=2.002936009 podStartE2EDuration="2.770739336s" podCreationTimestamp="2026-04-24 19:09:56 +0000 UTC" firstStartedPulling="2026-04-24 19:09:56.776624384 +0000 UTC m=+190.100868524" lastFinishedPulling="2026-04-24 19:09:57.54442771 +0000 UTC m=+190.868671851" observedRunningTime="2026-04-24 19:09:58.770157508 +0000 UTC m=+192.094401666" watchObservedRunningTime="2026-04-24 19:09:58.770739336 +0000 UTC m=+192.094983499" Apr 24 19:10:08.579116 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:08.579082 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c54c7b55f-r457g"] Apr 24 19:10:23.362818 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:23.362776 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" podUID="6257d846-2381-45dc-8742-f277854e8058" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 19:10:33.363426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.363386 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" podUID="6257d846-2381-45dc-8742-f277854e8058" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 19:10:33.597814 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.597770 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-image-registry/image-registry-c54c7b55f-r457g" podUID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7" containerName="registry" containerID="cri-o://8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81" gracePeriod=30 Apr 24 19:10:33.837279 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.837256 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" Apr 24 19:10:33.842989 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.842958 2564 generic.go:358] "Generic (PLEG): container finished" podID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7" containerID="8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81" exitCode=0 Apr 24 19:10:33.843116 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.843021 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" Apr 24 19:10:33.843116 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.843035 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" event={"ID":"11fadb13-0cb8-4aea-9078-f23ecd9c91a7","Type":"ContainerDied","Data":"8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81"} Apr 24 19:10:33.843116 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.843072 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-c54c7b55f-r457g" event={"ID":"11fadb13-0cb8-4aea-9078-f23ecd9c91a7","Type":"ContainerDied","Data":"32bccf8334d411fe85b09cae43cf6fd274b87c1b40d40f3a5d1f3f15935ba276"} Apr 24 19:10:33.843116 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.843089 2564 scope.go:117] "RemoveContainer" containerID="8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81" Apr 24 19:10:33.850371 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.850351 2564 scope.go:117] "RemoveContainer" 
containerID="8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81" Apr 24 19:10:33.850665 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:10:33.850648 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81\": container with ID starting with 8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81 not found: ID does not exist" containerID="8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81" Apr 24 19:10:33.850718 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.850677 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81"} err="failed to get container status \"8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81\": rpc error: code = NotFound desc = could not find container \"8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81\": container with ID starting with 8a08d057f68023ffd71c8a0f7a141c86a86ee51eb4fbcc6f4840efb05965ed81 not found: ID does not exist" Apr 24 19:10:33.967797 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.967690 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " Apr 24 19:10:33.967797 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.967742 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-installation-pull-secrets\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") " Apr 24 19:10:33.967797 ip-10-0-129-124 kubenswrapper[2564]: 
I0424 19:10:33.967772 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-certificates\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") "
Apr 24 19:10:33.968075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.967806 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-ca-trust-extracted\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") "
Apr 24 19:10:33.968075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.967834 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-image-registry-private-configuration\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") "
Apr 24 19:10:33.968075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.967884 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-trusted-ca\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") "
Apr 24 19:10:33.968075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.967926 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kq2w2\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-kube-api-access-kq2w2\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") "
Apr 24 19:10:33.968075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.967958 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-bound-sa-token\") pod \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\" (UID: \"11fadb13-0cb8-4aea-9078-f23ecd9c91a7\") "
Apr 24 19:10:33.968318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.968286 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:10:33.968457 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.968424 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:10:33.970435 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.970381 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:10:33.970435 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.970388 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:10:33.970645 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.970537 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-kube-api-access-kq2w2" (OuterVolumeSpecName: "kube-api-access-kq2w2") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "kube-api-access-kq2w2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:10:33.970645 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.970570 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:10:33.970750 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.970648 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:10:33.976862 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:33.976824 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "11fadb13-0cb8-4aea-9078-f23ecd9c91a7" (UID: "11fadb13-0cb8-4aea-9078-f23ecd9c91a7"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:10:34.068523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068467 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-certificates\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.068523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068512 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-ca-trust-extracted\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.068523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068523 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-image-registry-private-configuration\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.068523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068534 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-trusted-ca\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.068523 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068544 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kq2w2\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-kube-api-access-kq2w2\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.068853 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068580 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-bound-sa-token\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.068853 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068594 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-registry-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.068853 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.068625 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/11fadb13-0cb8-4aea-9078-f23ecd9c91a7-installation-pull-secrets\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:10:34.164989 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.164957 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-c54c7b55f-r457g"]
Apr 24 19:10:34.171144 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:34.171113 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-c54c7b55f-r457g"]
Apr 24 19:10:35.184174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:35.184132 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7" path="/var/lib/kubelet/pods/11fadb13-0cb8-4aea-9078-f23ecd9c91a7/volumes"
Apr 24 19:10:43.362936 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:43.362893 2564 prober.go:120] "Probe failed" probeType="Liveness"
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" podUID="6257d846-2381-45dc-8742-f277854e8058" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 19:10:43.363383 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:43.362962 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" Apr 24 19:10:43.363442 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:43.363424 2564 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"68b12cde808417f49f08ea346f5a111a80ca22b0d43d8cf38eb143167b65fdba"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 19:10:43.363481 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:43.363463 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" podUID="6257d846-2381-45dc-8742-f277854e8058" containerName="service-proxy" containerID="cri-o://68b12cde808417f49f08ea346f5a111a80ca22b0d43d8cf38eb143167b65fdba" gracePeriod=30 Apr 24 19:10:43.871685 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:43.871651 2564 generic.go:358] "Generic (PLEG): container finished" podID="6257d846-2381-45dc-8742-f277854e8058" containerID="68b12cde808417f49f08ea346f5a111a80ca22b0d43d8cf38eb143167b65fdba" exitCode=2 Apr 24 19:10:43.871869 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:43.871714 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" event={"ID":"6257d846-2381-45dc-8742-f277854e8058","Type":"ContainerDied","Data":"68b12cde808417f49f08ea346f5a111a80ca22b0d43d8cf38eb143167b65fdba"} Apr 24 
19:10:43.871869 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:43.871749 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5989f66-gg7kv" event={"ID":"6257d846-2381-45dc-8742-f277854e8058","Type":"ContainerStarted","Data":"415e410deec28fa4110ac8c0c2b8c972e29e4f18e352135c48f2dbf5bdee737c"} Apr 24 19:10:59.051802 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:59.051757 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:10:59.053953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:59.053925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9104b10b-fe94-4977-b556-addf9a7f232f-metrics-certs\") pod \"network-metrics-daemon-p2bz2\" (UID: \"9104b10b-fe94-4977-b556-addf9a7f232f\") " pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:10:59.284280 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:59.284246 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-h6pgv\"" Apr 24 19:10:59.292451 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:59.292421 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-p2bz2" Apr 24 19:10:59.411061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:59.411028 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-p2bz2"] Apr 24 19:10:59.414768 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:10:59.414738 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9104b10b_fe94_4977_b556_addf9a7f232f.slice/crio-28e039e9405674f2f94f97bd540881363c9545eb52e9e2fe38031ea18c827960 WatchSource:0}: Error finding container 28e039e9405674f2f94f97bd540881363c9545eb52e9e2fe38031ea18c827960: Status 404 returned error can't find the container with id 28e039e9405674f2f94f97bd540881363c9545eb52e9e2fe38031ea18c827960 Apr 24 19:10:59.912973 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:10:59.912927 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p2bz2" event={"ID":"9104b10b-fe94-4977-b556-addf9a7f232f","Type":"ContainerStarted","Data":"28e039e9405674f2f94f97bd540881363c9545eb52e9e2fe38031ea18c827960"} Apr 24 19:11:00.917432 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:00.917338 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p2bz2" event={"ID":"9104b10b-fe94-4977-b556-addf9a7f232f","Type":"ContainerStarted","Data":"aaf183129450214d011de1b5ed6eb2449fe42ef49f33a520fd0859936d99fb2c"} Apr 24 19:11:00.917432 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:00.917383 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-p2bz2" event={"ID":"9104b10b-fe94-4977-b556-addf9a7f232f","Type":"ContainerStarted","Data":"452e7cd6844d787c5ef58b6a84dc902f23dbf8b285d9920e9378f04186b8a730"} Apr 24 19:11:00.939104 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:00.939053 2564 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-p2bz2" podStartSLOduration=252.876529157 podStartE2EDuration="4m13.939035978s" podCreationTimestamp="2026-04-24 19:06:47 +0000 UTC" firstStartedPulling="2026-04-24 19:10:59.416533902 +0000 UTC m=+252.740778046" lastFinishedPulling="2026-04-24 19:11:00.479040712 +0000 UTC m=+253.803284867" observedRunningTime="2026-04-24 19:11:00.936517888 +0000 UTC m=+254.260762051" watchObservedRunningTime="2026-04-24 19:11:00.939035978 +0000 UTC m=+254.263280140" Apr 24 19:11:29.676926 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.676889 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" Apr 24 19:11:29.679358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.679332 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/567b5df7-6cf3-459e-a1cc-68aa56346b42-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-tfqzr\" (UID: \"567b5df7-6cf3-459e-a1cc-68aa56346b42\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" Apr 24 19:11:29.683497 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.683467 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-7kwb2\"" Apr 24 19:11:29.691620 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.691598 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" Apr 24 19:11:29.777325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.777294 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:11:29.779696 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.779669 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ab654b1c-fb83-4468-9595-2d19444f6f70-cert\") pod \"ingress-canary-jzpqz\" (UID: \"ab654b1c-fb83-4468-9595-2d19444f6f70\") " pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:11:29.783950 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.783917 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-tkcsj\"" Apr 24 19:11:29.791807 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.791775 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-jzpqz" Apr 24 19:11:29.834465 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.834434 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr"] Apr 24 19:11:29.838270 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:11:29.838227 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod567b5df7_6cf3_459e_a1cc_68aa56346b42.slice/crio-b7c32f791434082756d47d7490c3155caf4e9f808080ef7a58801e01ec66acd2 WatchSource:0}: Error finding container b7c32f791434082756d47d7490c3155caf4e9f808080ef7a58801e01ec66acd2: Status 404 returned error can't find the container with id b7c32f791434082756d47d7490c3155caf4e9f808080ef7a58801e01ec66acd2 Apr 24 19:11:29.911816 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.911786 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-jzpqz"] Apr 24 19:11:29.914738 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:11:29.914706 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab654b1c_fb83_4468_9595_2d19444f6f70.slice/crio-510e9bed47e4eeaa982bab83e8cb3bb98a9896a3c18019ad6db28eb22c7fe81a WatchSource:0}: Error finding container 510e9bed47e4eeaa982bab83e8cb3bb98a9896a3c18019ad6db28eb22c7fe81a: Status 404 returned error can't find the container with id 510e9bed47e4eeaa982bab83e8cb3bb98a9896a3c18019ad6db28eb22c7fe81a Apr 24 19:11:29.992518 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:29.992481 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jzpqz" event={"ID":"ab654b1c-fb83-4468-9595-2d19444f6f70","Type":"ContainerStarted","Data":"510e9bed47e4eeaa982bab83e8cb3bb98a9896a3c18019ad6db28eb22c7fe81a"} Apr 24 19:11:29.993416 ip-10-0-129-124 kubenswrapper[2564]: 
I0424 19:11:29.993394 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" event={"ID":"567b5df7-6cf3-459e-a1cc-68aa56346b42","Type":"ContainerStarted","Data":"b7c32f791434082756d47d7490c3155caf4e9f808080ef7a58801e01ec66acd2"} Apr 24 19:11:30.997475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:30.997426 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" event={"ID":"567b5df7-6cf3-459e-a1cc-68aa56346b42","Type":"ContainerStarted","Data":"25da1334e44ff0ec7f488559eea26880a3aabd6f410620166a7cd874c7e40e4e"} Apr 24 19:11:31.015290 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:31.015227 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-tfqzr" podStartSLOduration=272.010653215 podStartE2EDuration="4m33.015207984s" podCreationTimestamp="2026-04-24 19:06:58 +0000 UTC" firstStartedPulling="2026-04-24 19:11:29.840855857 +0000 UTC m=+283.165099999" lastFinishedPulling="2026-04-24 19:11:30.845410624 +0000 UTC m=+284.169654768" observedRunningTime="2026-04-24 19:11:31.013694283 +0000 UTC m=+284.337938458" watchObservedRunningTime="2026-04-24 19:11:31.015207984 +0000 UTC m=+284.339452148" Apr 24 19:11:32.001487 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:32.001446 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-jzpqz" event={"ID":"ab654b1c-fb83-4468-9595-2d19444f6f70","Type":"ContainerStarted","Data":"5f6d8077c7567fb450743911f6f228d9819cde6161a809357ca6161cf85a4883"} Apr 24 19:11:32.018879 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:11:32.018826 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-jzpqz" podStartSLOduration=251.094140374 podStartE2EDuration="4m13.018809376s" podCreationTimestamp="2026-04-24 
19:07:19 +0000 UTC" firstStartedPulling="2026-04-24 19:11:29.916520863 +0000 UTC m=+283.240765004" lastFinishedPulling="2026-04-24 19:11:31.841189857 +0000 UTC m=+285.165434006" observedRunningTime="2026-04-24 19:11:32.017492785 +0000 UTC m=+285.341736951" watchObservedRunningTime="2026-04-24 19:11:32.018809376 +0000 UTC m=+285.343053539" Apr 24 19:13:57.786075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.786042 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-pjqz7"] Apr 24 19:13:57.786524 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.786294 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7" containerName="registry" Apr 24 19:13:57.786524 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.786305 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7" containerName="registry" Apr 24 19:13:57.786524 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.786355 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="11fadb13-0cb8-4aea-9078-f23ecd9c91a7" containerName="registry" Apr 24 19:13:57.789024 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.789006 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pjqz7" Apr 24 19:13:57.791363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.791332 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 24 19:13:57.791488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.791424 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 24 19:13:57.791488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.791436 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 24 19:13:57.792273 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.792258 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-n2965\"" Apr 24 19:13:57.792363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.792303 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 24 19:13:57.796423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.796400 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pjqz7"] Apr 24 19:13:57.908496 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.908441 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/760e7d37-2023-4fc6-af6b-b727d6511c75-certificates\") pod \"keda-admission-cf49989db-pjqz7\" (UID: \"760e7d37-2023-4fc6-af6b-b727d6511c75\") " pod="openshift-keda/keda-admission-cf49989db-pjqz7" Apr 24 19:13:57.908706 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:57.908514 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msnnt\" (UniqueName: 
\"kubernetes.io/projected/760e7d37-2023-4fc6-af6b-b727d6511c75-kube-api-access-msnnt\") pod \"keda-admission-cf49989db-pjqz7\" (UID: \"760e7d37-2023-4fc6-af6b-b727d6511c75\") " pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:13:58.009272 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.009234 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/760e7d37-2023-4fc6-af6b-b727d6511c75-certificates\") pod \"keda-admission-cf49989db-pjqz7\" (UID: \"760e7d37-2023-4fc6-af6b-b727d6511c75\") " pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:13:58.009467 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.009292 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msnnt\" (UniqueName: \"kubernetes.io/projected/760e7d37-2023-4fc6-af6b-b727d6511c75-kube-api-access-msnnt\") pod \"keda-admission-cf49989db-pjqz7\" (UID: \"760e7d37-2023-4fc6-af6b-b727d6511c75\") " pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:13:58.011686 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.011664 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/760e7d37-2023-4fc6-af6b-b727d6511c75-certificates\") pod \"keda-admission-cf49989db-pjqz7\" (UID: \"760e7d37-2023-4fc6-af6b-b727d6511c75\") " pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:13:58.021175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.021146 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msnnt\" (UniqueName: \"kubernetes.io/projected/760e7d37-2023-4fc6-af6b-b727d6511c75-kube-api-access-msnnt\") pod \"keda-admission-cf49989db-pjqz7\" (UID: \"760e7d37-2023-4fc6-af6b-b727d6511c75\") " pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:13:58.100078 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.099982 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:13:58.221383 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.221345 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-pjqz7"]
Apr 24 19:13:58.226246 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:13:58.226217 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod760e7d37_2023_4fc6_af6b_b727d6511c75.slice/crio-55fd2eebf02fca7f7c2a33720310d946f198a1255829f693235515306c9b357c WatchSource:0}: Error finding container 55fd2eebf02fca7f7c2a33720310d946f198a1255829f693235515306c9b357c: Status 404 returned error can't find the container with id 55fd2eebf02fca7f7c2a33720310d946f198a1255829f693235515306c9b357c
Apr 24 19:13:58.227520 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.227504 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:13:58.361631 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:13:58.361526 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pjqz7" event={"ID":"760e7d37-2023-4fc6-af6b-b727d6511c75","Type":"ContainerStarted","Data":"55fd2eebf02fca7f7c2a33720310d946f198a1255829f693235515306c9b357c"}
Apr 24 19:14:01.371208 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:14:01.371173 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-pjqz7" event={"ID":"760e7d37-2023-4fc6-af6b-b727d6511c75","Type":"ContainerStarted","Data":"71c4bcfd168247bb638da876e2842ea8c095cc2b9942144bd76eb17969e616b1"}
Apr 24 19:14:01.371605 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:14:01.371293 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:14:01.387971 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:14:01.387915 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-pjqz7" podStartSLOduration=2.094334244 podStartE2EDuration="4.387896655s" podCreationTimestamp="2026-04-24 19:13:57 +0000 UTC" firstStartedPulling="2026-04-24 19:13:58.227671927 +0000 UTC m=+431.551916067" lastFinishedPulling="2026-04-24 19:14:00.521234325 +0000 UTC m=+433.845478478" observedRunningTime="2026-04-24 19:14:01.386840687 +0000 UTC m=+434.711084851" watchObservedRunningTime="2026-04-24 19:14:01.387896655 +0000 UTC m=+434.712140819"
Apr 24 19:14:22.376605 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:14:22.376513 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-pjqz7"
Apr 24 19:15:03.918828 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.918794 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-79f457656b-dnvmz"]
Apr 24 19:15:03.922282 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.922261 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:03.926907 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.926886 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 24 19:15:03.928629 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.928614 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\""
Apr 24 19:15:03.943095 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.943068 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-dnd96\""
Apr 24 19:15:03.948781 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.948752 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 24 19:15:03.949935 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.949907 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-79f457656b-dnvmz"]
Apr 24 19:15:03.954232 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.954208 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf8cm\" (UniqueName: \"kubernetes.io/projected/244773f8-4954-457f-b86b-6d39c169449e-kube-api-access-qf8cm\") pod \"seaweedfs-79f457656b-dnvmz\" (UID: \"244773f8-4954-457f-b86b-6d39c169449e\") " pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:03.954347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:03.954264 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/244773f8-4954-457f-b86b-6d39c169449e-data\") pod \"seaweedfs-79f457656b-dnvmz\" (UID: \"244773f8-4954-457f-b86b-6d39c169449e\") " pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:04.055300 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:04.055261 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf8cm\" (UniqueName: \"kubernetes.io/projected/244773f8-4954-457f-b86b-6d39c169449e-kube-api-access-qf8cm\") pod \"seaweedfs-79f457656b-dnvmz\" (UID: \"244773f8-4954-457f-b86b-6d39c169449e\") " pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:04.055474 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:04.055335 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/244773f8-4954-457f-b86b-6d39c169449e-data\") pod \"seaweedfs-79f457656b-dnvmz\" (UID: \"244773f8-4954-457f-b86b-6d39c169449e\") " pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:04.055706 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:04.055691 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/244773f8-4954-457f-b86b-6d39c169449e-data\") pod \"seaweedfs-79f457656b-dnvmz\" (UID: \"244773f8-4954-457f-b86b-6d39c169449e\") " pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:04.064830 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:04.064807 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf8cm\" (UniqueName: \"kubernetes.io/projected/244773f8-4954-457f-b86b-6d39c169449e-kube-api-access-qf8cm\") pod \"seaweedfs-79f457656b-dnvmz\" (UID: \"244773f8-4954-457f-b86b-6d39c169449e\") " pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:04.233239 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:04.233196 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:04.352412 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:04.352383 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-79f457656b-dnvmz"]
Apr 24 19:15:04.355074 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:15:04.355043 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244773f8_4954_457f_b86b_6d39c169449e.slice/crio-f9f94f60c4a29a7e4fdcdafd5574a32079509d91c6ec9b1889288288ad601a5e WatchSource:0}: Error finding container f9f94f60c4a29a7e4fdcdafd5574a32079509d91c6ec9b1889288288ad601a5e: Status 404 returned error can't find the container with id f9f94f60c4a29a7e4fdcdafd5574a32079509d91c6ec9b1889288288ad601a5e
Apr 24 19:15:04.530434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:04.530342 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-79f457656b-dnvmz" event={"ID":"244773f8-4954-457f-b86b-6d39c169449e","Type":"ContainerStarted","Data":"f9f94f60c4a29a7e4fdcdafd5574a32079509d91c6ec9b1889288288ad601a5e"}
Apr 24 19:15:08.544436 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:08.544397 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-79f457656b-dnvmz" event={"ID":"244773f8-4954-457f-b86b-6d39c169449e","Type":"ContainerStarted","Data":"ebecbe9ebf447dfcb0c216eda4b47042448c05cdae8eef79e2416c601e7e4224"}
Apr 24 19:15:08.544844 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:08.544659 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:15:08.564484 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:08.564431 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-79f457656b-dnvmz" podStartSLOduration=2.190604904 podStartE2EDuration="5.564415148s" podCreationTimestamp="2026-04-24 19:15:03 +0000 UTC" firstStartedPulling="2026-04-24 19:15:04.356273259 +0000 UTC m=+497.680517401" lastFinishedPulling="2026-04-24 19:15:07.730083504 +0000 UTC m=+501.054327645" observedRunningTime="2026-04-24 19:15:08.562783794 +0000 UTC m=+501.887027957" watchObservedRunningTime="2026-04-24 19:15:08.564415148 +0000 UTC m=+501.888659309"
Apr 24 19:15:14.550092 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:15:14.550060 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-79f457656b-dnvmz"
Apr 24 19:16:13.869208 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.869105 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-lw4fp"]
Apr 24 19:16:13.872331 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.872306 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:13.875371 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.875345 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\""
Apr 24 19:16:13.876458 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.876439 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-5xwzt\""
Apr 24 19:16:13.886979 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.886950 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-sjh7l"]
Apr 24 19:16:13.889936 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.889913 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:13.891094 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.891068 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-lw4fp"]
Apr 24 19:16:13.892768 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.892745 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\""
Apr 24 19:16:13.892954 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.892929 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-rlkqg\""
Apr 24 19:16:13.905994 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.905966 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-sjh7l"]
Apr 24 19:16:13.952678 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.952639 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31c81dac-86cd-4bd5-95c9-6378736961c3-tls-certs\") pod \"model-serving-api-86f7b4b499-lw4fp\" (UID: \"31c81dac-86cd-4bd5-95c9-6378736961c3\") " pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:13.952878 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:13.952693 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-979b9\" (UniqueName: \"kubernetes.io/projected/31c81dac-86cd-4bd5-95c9-6378736961c3-kube-api-access-979b9\") pod \"model-serving-api-86f7b4b499-lw4fp\" (UID: \"31c81dac-86cd-4bd5-95c9-6378736961c3\") " pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:14.053078 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.053033 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31c81dac-86cd-4bd5-95c9-6378736961c3-tls-certs\") pod \"model-serving-api-86f7b4b499-lw4fp\" (UID: \"31c81dac-86cd-4bd5-95c9-6378736961c3\") " pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:14.053267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.053107 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-979b9\" (UniqueName: \"kubernetes.io/projected/31c81dac-86cd-4bd5-95c9-6378736961c3-kube-api-access-979b9\") pod \"model-serving-api-86f7b4b499-lw4fp\" (UID: \"31c81dac-86cd-4bd5-95c9-6378736961c3\") " pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:14.053267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.053145 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssngp\" (UniqueName: \"kubernetes.io/projected/b8b105d2-a24a-49da-a5f4-94c41212b27c-kube-api-access-ssngp\") pod \"odh-model-controller-696fc77849-sjh7l\" (UID: \"b8b105d2-a24a-49da-a5f4-94c41212b27c\") " pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:14.053267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.053176 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b105d2-a24a-49da-a5f4-94c41212b27c-cert\") pod \"odh-model-controller-696fc77849-sjh7l\" (UID: \"b8b105d2-a24a-49da-a5f4-94c41212b27c\") " pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:14.055484 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.055462 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/31c81dac-86cd-4bd5-95c9-6378736961c3-tls-certs\") pod \"model-serving-api-86f7b4b499-lw4fp\" (UID: \"31c81dac-86cd-4bd5-95c9-6378736961c3\") " pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:14.066244 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.066202 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-979b9\" (UniqueName: \"kubernetes.io/projected/31c81dac-86cd-4bd5-95c9-6378736961c3-kube-api-access-979b9\") pod \"model-serving-api-86f7b4b499-lw4fp\" (UID: \"31c81dac-86cd-4bd5-95c9-6378736961c3\") " pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:14.154185 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.154089 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ssngp\" (UniqueName: \"kubernetes.io/projected/b8b105d2-a24a-49da-a5f4-94c41212b27c-kube-api-access-ssngp\") pod \"odh-model-controller-696fc77849-sjh7l\" (UID: \"b8b105d2-a24a-49da-a5f4-94c41212b27c\") " pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:14.154185 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.154134 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b105d2-a24a-49da-a5f4-94c41212b27c-cert\") pod \"odh-model-controller-696fc77849-sjh7l\" (UID: \"b8b105d2-a24a-49da-a5f4-94c41212b27c\") " pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:14.156510 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.156488 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b8b105d2-a24a-49da-a5f4-94c41212b27c-cert\") pod \"odh-model-controller-696fc77849-sjh7l\" (UID: \"b8b105d2-a24a-49da-a5f4-94c41212b27c\") " pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:14.163294 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.163270 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssngp\" (UniqueName: \"kubernetes.io/projected/b8b105d2-a24a-49da-a5f4-94c41212b27c-kube-api-access-ssngp\") pod \"odh-model-controller-696fc77849-sjh7l\" (UID: \"b8b105d2-a24a-49da-a5f4-94c41212b27c\") " pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:14.181715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.181672 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:14.200642 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.200608 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:14.321247 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.321211 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-lw4fp"]
Apr 24 19:16:14.324245 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:16:14.324214 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31c81dac_86cd_4bd5_95c9_6378736961c3.slice/crio-fa6a4728a326cc1e7c1a9931bfbc93b75a8e637200d01fe85738b0637b7d387a WatchSource:0}: Error finding container fa6a4728a326cc1e7c1a9931bfbc93b75a8e637200d01fe85738b0637b7d387a: Status 404 returned error can't find the container with id fa6a4728a326cc1e7c1a9931bfbc93b75a8e637200d01fe85738b0637b7d387a
Apr 24 19:16:14.339912 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.339884 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-sjh7l"]
Apr 24 19:16:14.343045 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:16:14.343014 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8b105d2_a24a_49da_a5f4_94c41212b27c.slice/crio-359d56a6c2db97e3683a1ce6632b988c7d8cbe5d619c0f3037018e66db8e470d WatchSource:0}: Error finding container 359d56a6c2db97e3683a1ce6632b988c7d8cbe5d619c0f3037018e66db8e470d: Status 404 returned error can't find the container with id 359d56a6c2db97e3683a1ce6632b988c7d8cbe5d619c0f3037018e66db8e470d
Apr 24 19:16:14.712517 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.712475 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-sjh7l" event={"ID":"b8b105d2-a24a-49da-a5f4-94c41212b27c","Type":"ContainerStarted","Data":"359d56a6c2db97e3683a1ce6632b988c7d8cbe5d619c0f3037018e66db8e470d"}
Apr 24 19:16:14.713443 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:14.713421 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-lw4fp" event={"ID":"31c81dac-86cd-4bd5-95c9-6378736961c3","Type":"ContainerStarted","Data":"fa6a4728a326cc1e7c1a9931bfbc93b75a8e637200d01fe85738b0637b7d387a"}
Apr 24 19:16:18.726786 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:18.726743 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-sjh7l" event={"ID":"b8b105d2-a24a-49da-a5f4-94c41212b27c","Type":"ContainerStarted","Data":"11d99791ab0d76ce666b72fb72d71100f4e5512fad9675a3ae9a732e1cbc0789"}
Apr 24 19:16:18.727264 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:18.726984 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:18.728239 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:18.728205 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-lw4fp" event={"ID":"31c81dac-86cd-4bd5-95c9-6378736961c3","Type":"ContainerStarted","Data":"c5f994d57c25b5523ab66c1413f3b903522565997665e4a90b602f3687f42071"}
Apr 24 19:16:18.728373 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:18.728352 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:18.745992 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:18.745922 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-sjh7l" podStartSLOduration=2.07704229 podStartE2EDuration="5.745908036s" podCreationTimestamp="2026-04-24 19:16:13 +0000 UTC" firstStartedPulling="2026-04-24 19:16:14.344257777 +0000 UTC m=+567.668501918" lastFinishedPulling="2026-04-24 19:16:18.013123509 +0000 UTC m=+571.337367664" observedRunningTime="2026-04-24 19:16:18.744919011 +0000 UTC m=+572.069163165" watchObservedRunningTime="2026-04-24 19:16:18.745908036 +0000 UTC m=+572.070152199"
Apr 24 19:16:18.765884 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:18.765831 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-lw4fp" podStartSLOduration=2.083291836 podStartE2EDuration="5.765810985s" podCreationTimestamp="2026-04-24 19:16:13 +0000 UTC" firstStartedPulling="2026-04-24 19:16:14.325952141 +0000 UTC m=+567.650196283" lastFinishedPulling="2026-04-24 19:16:18.008471284 +0000 UTC m=+571.332715432" observedRunningTime="2026-04-24 19:16:18.76426397 +0000 UTC m=+572.088508134" watchObservedRunningTime="2026-04-24 19:16:18.765810985 +0000 UTC m=+572.090055165"
Apr 24 19:16:29.733637 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:29.733608 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-sjh7l"
Apr 24 19:16:29.735644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:29.735620 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-lw4fp"
Apr 24 19:16:41.405284 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.405243 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"]
Apr 24 19:16:41.408466 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.408441 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.410823 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.410802 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 24 19:16:41.415837 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.415801 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"]
Apr 24 19:16:41.554760 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.554716 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx24h\" (UniqueName: \"kubernetes.io/projected/0e684c7c-8554-4a13-bc45-eaaec01e04ee-kube-api-access-kx24h\") pod \"seaweedfs-tls-custom-ddd4dbfd-p7lgs\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.554760 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.554758 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e684c7c-8554-4a13-bc45-eaaec01e04ee-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p7lgs\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.655163 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.655111 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kx24h\" (UniqueName: \"kubernetes.io/projected/0e684c7c-8554-4a13-bc45-eaaec01e04ee-kube-api-access-kx24h\") pod \"seaweedfs-tls-custom-ddd4dbfd-p7lgs\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.655163 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.655170 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e684c7c-8554-4a13-bc45-eaaec01e04ee-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p7lgs\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.655617 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.655544 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e684c7c-8554-4a13-bc45-eaaec01e04ee-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p7lgs\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.663103 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.663079 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kx24h\" (UniqueName: \"kubernetes.io/projected/0e684c7c-8554-4a13-bc45-eaaec01e04ee-kube-api-access-kx24h\") pod \"seaweedfs-tls-custom-ddd4dbfd-p7lgs\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.717676 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.717636 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:41.833930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:41.833893 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"]
Apr 24 19:16:41.836690 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:16:41.836660 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e684c7c_8554_4a13_bc45_eaaec01e04ee.slice/crio-c04051e8d43174ed886f25882b00c528eb2f01c2aa25f393ca07784b57342bc2 WatchSource:0}: Error finding container c04051e8d43174ed886f25882b00c528eb2f01c2aa25f393ca07784b57342bc2: Status 404 returned error can't find the container with id c04051e8d43174ed886f25882b00c528eb2f01c2aa25f393ca07784b57342bc2
Apr 24 19:16:42.790573 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:42.790514 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs" event={"ID":"0e684c7c-8554-4a13-bc45-eaaec01e04ee","Type":"ContainerStarted","Data":"22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2"}
Apr 24 19:16:42.790573 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:42.790566 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs" event={"ID":"0e684c7c-8554-4a13-bc45-eaaec01e04ee","Type":"ContainerStarted","Data":"c04051e8d43174ed886f25882b00c528eb2f01c2aa25f393ca07784b57342bc2"}
Apr 24 19:16:42.806346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:42.806299 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs" podStartSLOduration=1.53393204 podStartE2EDuration="1.806285709s" podCreationTimestamp="2026-04-24 19:16:41 +0000 UTC" firstStartedPulling="2026-04-24 19:16:41.837979199 +0000 UTC m=+595.162223341" lastFinishedPulling="2026-04-24 19:16:42.110332856 +0000 UTC m=+595.434577010" observedRunningTime="2026-04-24 19:16:42.805483746 +0000 UTC m=+596.129727908" watchObservedRunningTime="2026-04-24 19:16:42.806285709 +0000 UTC m=+596.130529888"
Apr 24 19:16:44.334127 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:44.334094 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"]
Apr 24 19:16:44.796390 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:44.796354 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs" podUID="0e684c7c-8554-4a13-bc45-eaaec01e04ee" containerName="seaweedfs-tls-custom" containerID="cri-o://22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2" gracePeriod=30
Apr 24 19:16:46.035774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.035750 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:46.186765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.186669 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kx24h\" (UniqueName: \"kubernetes.io/projected/0e684c7c-8554-4a13-bc45-eaaec01e04ee-kube-api-access-kx24h\") pod \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") "
Apr 24 19:16:46.186765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.186721 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e684c7c-8554-4a13-bc45-eaaec01e04ee-data\") pod \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\" (UID: \"0e684c7c-8554-4a13-bc45-eaaec01e04ee\") "
Apr 24 19:16:46.188039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.188007 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e684c7c-8554-4a13-bc45-eaaec01e04ee-data" (OuterVolumeSpecName: "data") pod "0e684c7c-8554-4a13-bc45-eaaec01e04ee" (UID: "0e684c7c-8554-4a13-bc45-eaaec01e04ee"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:16:46.188879 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.188860 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e684c7c-8554-4a13-bc45-eaaec01e04ee-kube-api-access-kx24h" (OuterVolumeSpecName: "kube-api-access-kx24h") pod "0e684c7c-8554-4a13-bc45-eaaec01e04ee" (UID: "0e684c7c-8554-4a13-bc45-eaaec01e04ee"). InnerVolumeSpecName "kube-api-access-kx24h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:16:46.287607 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.287571 2564 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/0e684c7c-8554-4a13-bc45-eaaec01e04ee-data\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:16:46.287607 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.287601 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kx24h\" (UniqueName: \"kubernetes.io/projected/0e684c7c-8554-4a13-bc45-eaaec01e04ee-kube-api-access-kx24h\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:16:46.802809 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.802771 2564 generic.go:358] "Generic (PLEG): container finished" podID="0e684c7c-8554-4a13-bc45-eaaec01e04ee" containerID="22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2" exitCode=0
Apr 24 19:16:46.802998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.802859 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"
Apr 24 19:16:46.802998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.802870 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs" event={"ID":"0e684c7c-8554-4a13-bc45-eaaec01e04ee","Type":"ContainerDied","Data":"22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2"}
Apr 24 19:16:46.802998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.802915 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs" event={"ID":"0e684c7c-8554-4a13-bc45-eaaec01e04ee","Type":"ContainerDied","Data":"c04051e8d43174ed886f25882b00c528eb2f01c2aa25f393ca07784b57342bc2"}
Apr 24 19:16:46.802998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.802936 2564 scope.go:117] "RemoveContainer" containerID="22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2"
Apr 24 19:16:46.811953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.811932 2564 scope.go:117] "RemoveContainer" containerID="22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2"
Apr 24 19:16:46.812218 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:16:46.812198 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2\": container with ID starting with 22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2 not found: ID does not exist" containerID="22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2"
Apr 24 19:16:46.812261 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.812228 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2"} err="failed to get container status \"22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2\": rpc error: code = NotFound desc = could not find container \"22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2\": container with ID starting with 22afaceac1244dbf958b8cb10cea8ba24ab6b16bb1e460b6882c91af2c4ecde2 not found: ID does not exist"
Apr 24 19:16:46.822693 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.822667 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"]
Apr 24 19:16:46.824312 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.824290 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p7lgs"]
Apr 24 19:16:46.850760 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.850733 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f"]
Apr 24 19:16:46.851023 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.851011 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e684c7c-8554-4a13-bc45-eaaec01e04ee" containerName="seaweedfs-tls-custom"
Apr 24 19:16:46.851066 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.851024 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e684c7c-8554-4a13-bc45-eaaec01e04ee" containerName="seaweedfs-tls-custom"
Apr 24 19:16:46.851099 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.851070 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e684c7c-8554-4a13-bc45-eaaec01e04ee" containerName="seaweedfs-tls-custom"
Apr 24 19:16:46.853722 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.853705 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f"
Apr 24 19:16:46.856019 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.855999 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 24 19:16:46.856123 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.856104 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 24 19:16:46.859357 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.859331 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f"]
Apr 24 19:16:46.992331 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.992287 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/f676d40b-e099-4abd-952c-a9a1e24357f2-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f"
Apr 24 19:16:46.992498 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.992341 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmc6x\" (UniqueName: \"kubernetes.io/projected/f676d40b-e099-4abd-952c-a9a1e24357f2-kube-api-access-mmc6x\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f"
Apr 24 19:16:46.992498 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:46.992424 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f676d40b-e099-4abd-952c-a9a1e24357f2-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f"
Apr 24 19:16:47.093040 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.092948 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmc6x\" (UniqueName: \"kubernetes.io/projected/f676d40b-e099-4abd-952c-a9a1e24357f2-kube-api-access-mmc6x\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" Apr 24 19:16:47.093040 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.093004 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f676d40b-e099-4abd-952c-a9a1e24357f2-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" Apr 24 19:16:47.093493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.093048 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/f676d40b-e099-4abd-952c-a9a1e24357f2-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" Apr 24 19:16:47.093493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.093417 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/f676d40b-e099-4abd-952c-a9a1e24357f2-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" Apr 24 19:16:47.097800 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.097778 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 24 19:16:47.105539 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.105510 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/f676d40b-e099-4abd-952c-a9a1e24357f2-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" Apr 24 19:16:47.113698 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.113042 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmc6x\" (UniqueName: \"kubernetes.io/projected/f676d40b-e099-4abd-952c-a9a1e24357f2-kube-api-access-mmc6x\") pod \"seaweedfs-tls-custom-5c88b85bb7-2h62f\" (UID: \"f676d40b-e099-4abd-952c-a9a1e24357f2\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" Apr 24 19:16:47.163183 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.163147 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" Apr 24 19:16:47.186277 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.186240 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e684c7c-8554-4a13-bc45-eaaec01e04ee" path="/var/lib/kubelet/pods/0e684c7c-8554-4a13-bc45-eaaec01e04ee/volumes" Apr 24 19:16:47.292294 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.292254 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f"] Apr 24 19:16:47.295259 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:16:47.295221 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf676d40b_e099_4abd_952c_a9a1e24357f2.slice/crio-7ea5688fb722a768b0b63fdc22c87ca6856645b1232becad89ec2d7568fea6ec WatchSource:0}: Error finding container 7ea5688fb722a768b0b63fdc22c87ca6856645b1232becad89ec2d7568fea6ec: Status 404 returned error can't find the container with id 7ea5688fb722a768b0b63fdc22c87ca6856645b1232becad89ec2d7568fea6ec Apr 24 19:16:47.557062 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.557039 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 24 19:16:47.807224 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.807127 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" event={"ID":"f676d40b-e099-4abd-952c-a9a1e24357f2","Type":"ContainerStarted","Data":"24737017b98e14b6e78ed5f852034e8d2dec5f529b3087154f2780f5277a96d9"} Apr 24 19:16:47.807224 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.807163 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" event={"ID":"f676d40b-e099-4abd-952c-a9a1e24357f2","Type":"ContainerStarted","Data":"7ea5688fb722a768b0b63fdc22c87ca6856645b1232becad89ec2d7568fea6ec"} Apr 24 19:16:47.822399 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:47.822353 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-2h62f" podStartSLOduration=1.564462475 podStartE2EDuration="1.822338665s" podCreationTimestamp="2026-04-24 19:16:46 +0000 UTC" firstStartedPulling="2026-04-24 19:16:47.296712031 +0000 UTC m=+600.620956186" lastFinishedPulling="2026-04-24 19:16:47.554588222 +0000 UTC m=+600.878832376" observedRunningTime="2026-04-24 19:16:47.821327175 +0000 UTC m=+601.145571338" watchObservedRunningTime="2026-04-24 19:16:47.822338665 +0000 UTC m=+601.146582828" Apr 24 19:16:56.279938 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.279905 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw"] Apr 24 19:16:56.283835 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.283816 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.286069 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.286043 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\"" Apr 24 19:16:56.286180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.286046 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 24 19:16:56.290096 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.290070 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw"] Apr 24 19:16:56.357644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.357609 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/de019248-dfbd-4b16-8c85-d710c48b1922-data\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.357644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.357659 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.357856 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.357681 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwwcs\" (UniqueName: \"kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-kube-api-access-jwwcs\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " 
pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.458925 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.458881 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/de019248-dfbd-4b16-8c85-d710c48b1922-data\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.459119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.458939 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.459119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.458968 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwwcs\" (UniqueName: \"kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-kube-api-access-jwwcs\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.459119 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:16:56.459045 2564 projected.go:264] Couldn't get secret kserve/seaweedfs-tls-serving: secret "seaweedfs-tls-serving" not found Apr 24 19:16:56.459119 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:16:56.459068 2564 projected.go:194] Error preparing data for projected volume seaweedfs-tls-serving for pod kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw: secret "seaweedfs-tls-serving" not found Apr 24 19:16:56.459310 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:16:56.459148 2564 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-seaweedfs-tls-serving podName:de019248-dfbd-4b16-8c85-d710c48b1922 nodeName:}" failed. No retries permitted until 2026-04-24 19:16:56.95912429 +0000 UTC m=+610.283368434 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "seaweedfs-tls-serving" (UniqueName: "kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-seaweedfs-tls-serving") pod "seaweedfs-tls-serving-7fd5766db9-6qfcw" (UID: "de019248-dfbd-4b16-8c85-d710c48b1922") : secret "seaweedfs-tls-serving" not found Apr 24 19:16:56.459310 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.459291 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/de019248-dfbd-4b16-8c85-d710c48b1922-data\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.467624 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.467595 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwwcs\" (UniqueName: \"kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-kube-api-access-jwwcs\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.961946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.961900 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:56.964318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:56.964285 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/de019248-dfbd-4b16-8c85-d710c48b1922-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-6qfcw\" (UID: \"de019248-dfbd-4b16-8c85-d710c48b1922\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:57.193794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:57.193762 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" Apr 24 19:16:57.313324 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:57.313286 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw"] Apr 24 19:16:57.316400 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:16:57.316367 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde019248_dfbd_4b16_8c85_d710c48b1922.slice/crio-369cc16e815ef1fd4ad1e4d569693f941ab0332cdd5714de9a4d900897ebddcf WatchSource:0}: Error finding container 369cc16e815ef1fd4ad1e4d569693f941ab0332cdd5714de9a4d900897ebddcf: Status 404 returned error can't find the container with id 369cc16e815ef1fd4ad1e4d569693f941ab0332cdd5714de9a4d900897ebddcf Apr 24 19:16:57.834798 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:57.834759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" event={"ID":"de019248-dfbd-4b16-8c85-d710c48b1922","Type":"ContainerStarted","Data":"e5b236c27c38b6d6dfbe4c00d29caad53aa67d47db61c2ba01b27b327c52a990"} Apr 24 19:16:57.834798 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:57.834798 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" event={"ID":"de019248-dfbd-4b16-8c85-d710c48b1922","Type":"ContainerStarted","Data":"369cc16e815ef1fd4ad1e4d569693f941ab0332cdd5714de9a4d900897ebddcf"} Apr 24 19:16:57.849530 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:16:57.849454 
2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-6qfcw" podStartSLOduration=1.536758094 podStartE2EDuration="1.849437136s" podCreationTimestamp="2026-04-24 19:16:56 +0000 UTC" firstStartedPulling="2026-04-24 19:16:57.317687043 +0000 UTC m=+610.641931185" lastFinishedPulling="2026-04-24 19:16:57.630366082 +0000 UTC m=+610.954610227" observedRunningTime="2026-04-24 19:16:57.848605545 +0000 UTC m=+611.172849708" watchObservedRunningTime="2026-04-24 19:16:57.849437136 +0000 UTC m=+611.173681299" Apr 24 19:17:15.657888 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.657849 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc"] Apr 24 19:17:15.661718 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.661693 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.664203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.664178 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\"" Apr 24 19:17:15.664334 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.664243 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 19:17:15.664334 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.664244 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 19:17:15.664334 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.664244 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\"" Apr 24 19:17:15.665047 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.665033 2564 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-27j79\"" Apr 24 19:17:15.671765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.671741 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc"] Apr 24 19:17:15.696115 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.696081 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb832fca-6eae-401d-a27a-14e11ca6f81e-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.696115 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.696116 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb832fca-6eae-401d-a27a-14e11ca6f81e-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.696300 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.696191 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7ngz\" (UniqueName: \"kubernetes.io/projected/bb832fca-6eae-401d-a27a-14e11ca6f81e-kube-api-access-m7ngz\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.696300 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.696250 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb832fca-6eae-401d-a27a-14e11ca6f81e-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.797518 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.797470 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m7ngz\" (UniqueName: \"kubernetes.io/projected/bb832fca-6eae-401d-a27a-14e11ca6f81e-kube-api-access-m7ngz\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.797724 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.797575 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb832fca-6eae-401d-a27a-14e11ca6f81e-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.797767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.797720 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb832fca-6eae-401d-a27a-14e11ca6f81e-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.797805 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.797769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/bb832fca-6eae-401d-a27a-14e11ca6f81e-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.798194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.798163 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb832fca-6eae-401d-a27a-14e11ca6f81e-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.798434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.798416 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb832fca-6eae-401d-a27a-14e11ca6f81e-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.800076 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.800052 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb832fca-6eae-401d-a27a-14e11ca6f81e-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.806755 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.806724 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7ngz\" (UniqueName: \"kubernetes.io/projected/bb832fca-6eae-401d-a27a-14e11ca6f81e-kube-api-access-m7ngz\") pod 
\"isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:15.973646 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:15.973605 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:16.109581 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:16.109524 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc"] Apr 24 19:17:16.112655 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:17:16.112620 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb832fca_6eae_401d_a27a_14e11ca6f81e.slice/crio-bd54d90b5e9bedcc4ac3ad189d2b737238c9efd0a2cbd17e41b3fef004a52c25 WatchSource:0}: Error finding container bd54d90b5e9bedcc4ac3ad189d2b737238c9efd0a2cbd17e41b3fef004a52c25: Status 404 returned error can't find the container with id bd54d90b5e9bedcc4ac3ad189d2b737238c9efd0a2cbd17e41b3fef004a52c25 Apr 24 19:17:16.883867 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:16.883820 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerStarted","Data":"bd54d90b5e9bedcc4ac3ad189d2b737238c9efd0a2cbd17e41b3fef004a52c25"} Apr 24 19:17:20.899020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:20.898982 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerStarted","Data":"cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419"} Apr 24 19:17:23.909263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:23.909228 2564 generic.go:358] 
"Generic (PLEG): container finished" podID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerID="cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419" exitCode=0 Apr 24 19:17:23.909710 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:23.909299 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerDied","Data":"cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419"} Apr 24 19:17:37.959618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:37.959571 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerStarted","Data":"f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5"} Apr 24 19:17:39.967304 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:39.967273 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerStarted","Data":"8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d"} Apr 24 19:17:42.978565 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:42.978523 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerStarted","Data":"de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7"} Apr 24 19:17:42.979021 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:42.978708 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:42.997970 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:42.997922 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podStartSLOduration=1.2901283669999999 podStartE2EDuration="27.997908142s" podCreationTimestamp="2026-04-24 19:17:15 +0000 UTC" firstStartedPulling="2026-04-24 19:17:16.114654422 +0000 UTC m=+629.438898563" lastFinishedPulling="2026-04-24 19:17:42.822434179 +0000 UTC m=+656.146678338" observedRunningTime="2026-04-24 19:17:42.996446223 +0000 UTC m=+656.320690407" watchObservedRunningTime="2026-04-24 19:17:42.997908142 +0000 UTC m=+656.322152306" Apr 24 19:17:43.982359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:43.982314 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:43.982359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:43.982363 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:43.983476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:43.983445 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:17:43.984032 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:43.984005 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:17:44.985922 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:44.985873 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:17:44.986381 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:44.986278 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:17:44.989421 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:44.989400 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:17:45.989246 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:45.989201 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:17:45.989686 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:45.989544 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:17:55.989433 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:55.989380 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:17:55.989959 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:17:55.989808 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" 
podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:05.989791 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:05.989737 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:18:05.990335 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:05.990245 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:15.989142 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:15.989089 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:18:15.989677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:15.989460 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:25.989822 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:25.989776 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:18:25.990362 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:25.990224 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:35.989982 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:35.989896 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:18:35.990426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:35.990321 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:18:45.989756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:45.989726 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:18:45.990237 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:18:45.990041 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:19:00.686175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.686130 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc"] Apr 24 19:19:00.686722 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.686689 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" 
containerName="kserve-container" containerID="cri-o://f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5" gracePeriod=30 Apr 24 19:19:00.686799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.686717 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" containerID="cri-o://de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7" gracePeriod=30 Apr 24 19:19:00.686859 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.686726 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" containerID="cri-o://8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d" gracePeriod=30 Apr 24 19:19:00.800245 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.800214 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r"] Apr 24 19:19:00.802656 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.802638 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:00.804941 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.804911 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\"" Apr 24 19:19:00.805050 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.804945 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\"" Apr 24 19:19:00.814644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.814620 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r"] Apr 24 19:19:00.935447 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.935413 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmk7n\" (UniqueName: \"kubernetes.io/projected/3bf47579-704a-4869-a77a-44db3e58116f-kube-api-access-bmk7n\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:00.935626 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.935468 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bf47579-704a-4869-a77a-44db3e58116f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:00.935626 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.935601 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bf47579-704a-4869-a77a-44db3e58116f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:00.935752 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:00.935659 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bf47579-704a-4869-a77a-44db3e58116f-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.036728 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.036673 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bf47579-704a-4869-a77a-44db3e58116f-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.036938 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.036840 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmk7n\" (UniqueName: \"kubernetes.io/projected/3bf47579-704a-4869-a77a-44db3e58116f-kube-api-access-bmk7n\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.036938 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.036894 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/3bf47579-704a-4869-a77a-44db3e58116f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.037053 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.036942 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bf47579-704a-4869-a77a-44db3e58116f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.037328 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.037308 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bf47579-704a-4869-a77a-44db3e58116f-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.037546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.037529 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bf47579-704a-4869-a77a-44db3e58116f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.039176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.039155 2564 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bf47579-704a-4869-a77a-44db3e58116f-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.044921 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.044901 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmk7n\" (UniqueName: \"kubernetes.io/projected/3bf47579-704a-4869-a77a-44db3e58116f-kube-api-access-bmk7n\") pod \"isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.114952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.114891 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:01.205973 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.205942 2564 generic.go:358] "Generic (PLEG): container finished" podID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerID="8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d" exitCode=2 Apr 24 19:19:01.206132 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.206010 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerDied","Data":"8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d"} Apr 24 19:19:01.238123 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.238090 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r"] Apr 24 19:19:01.240503 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:19:01.240476 2564 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bf47579_704a_4869_a77a_44db3e58116f.slice/crio-588bbdcc4e7e99abf78617c1e7ed72cf614d8d445fc34e8ffdc283538f52e88e WatchSource:0}: Error finding container 588bbdcc4e7e99abf78617c1e7ed72cf614d8d445fc34e8ffdc283538f52e88e: Status 404 returned error can't find the container with id 588bbdcc4e7e99abf78617c1e7ed72cf614d8d445fc34e8ffdc283538f52e88e Apr 24 19:19:01.242307 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:01.242293 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:19:02.210521 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:02.210485 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerStarted","Data":"cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650"} Apr 24 19:19:02.210521 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:02.210523 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerStarted","Data":"588bbdcc4e7e99abf78617c1e7ed72cf614d8d445fc34e8ffdc283538f52e88e"} Apr 24 19:19:04.986710 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:04.986663 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 24 19:19:05.220428 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:05.220389 2564 generic.go:358] "Generic (PLEG): container finished" podID="bb832fca-6eae-401d-a27a-14e11ca6f81e" 
containerID="f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5" exitCode=0 Apr 24 19:19:05.220619 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:05.220455 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerDied","Data":"f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5"} Apr 24 19:19:05.221747 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:05.221725 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bf47579-704a-4869-a77a-44db3e58116f" containerID="cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650" exitCode=0 Apr 24 19:19:05.221871 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:05.221758 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerDied","Data":"cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650"} Apr 24 19:19:05.989524 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:05.989473 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:19:05.990021 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:05.989786 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:06.226620 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.226587 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerStarted","Data":"ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38"} Apr 24 19:19:06.226620 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.226625 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerStarted","Data":"3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f"} Apr 24 19:19:06.226825 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.226635 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerStarted","Data":"d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6"} Apr 24 19:19:06.226974 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.226947 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:06.226974 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.226983 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:06.227153 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.226996 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:06.228315 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.228277 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:19:06.228979 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.228957 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:06.246844 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:06.246760 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podStartSLOduration=6.24674489 podStartE2EDuration="6.24674489s" podCreationTimestamp="2026-04-24 19:19:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:19:06.245804948 +0000 UTC m=+739.570049110" watchObservedRunningTime="2026-04-24 19:19:06.24674489 +0000 UTC m=+739.570989043" Apr 24 19:19:07.230054 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:07.230011 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:19:07.230521 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:07.230486 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:09.987007 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:09.986965 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 24 19:19:12.236134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:12.236100 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:19:12.236705 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:12.236669 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:19:12.236925 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:12.236897 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:14.986663 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:14.986618 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 24 19:19:14.987115 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:14.986769 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:19:15.990051 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:15.990004 2564 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:19:15.990493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:15.990365 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:19.986983 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:19.986931 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 24 19:19:22.238726 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:22.238682 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:19:22.239203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:22.239143 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:24.986868 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:24.986823 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 24 19:19:25.990083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:25.990032 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 24 19:19:25.990610 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:25.990196 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:19:25.990610 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:25.990423 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:25.990610 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:25.990569 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:19:29.986753 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:29.986705 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 24 19:19:30.833143 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.833119 2564 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:19:30.860328 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.860299 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb832fca-6eae-401d-a27a-14e11ca6f81e-proxy-tls\") pod \"bb832fca-6eae-401d-a27a-14e11ca6f81e\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " Apr 24 19:19:30.860494 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.860342 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m7ngz\" (UniqueName: \"kubernetes.io/projected/bb832fca-6eae-401d-a27a-14e11ca6f81e-kube-api-access-m7ngz\") pod \"bb832fca-6eae-401d-a27a-14e11ca6f81e\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " Apr 24 19:19:30.860583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.860524 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb832fca-6eae-401d-a27a-14e11ca6f81e-kserve-provision-location\") pod \"bb832fca-6eae-401d-a27a-14e11ca6f81e\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " Apr 24 19:19:30.860647 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.860615 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb832fca-6eae-401d-a27a-14e11ca6f81e-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"bb832fca-6eae-401d-a27a-14e11ca6f81e\" (UID: \"bb832fca-6eae-401d-a27a-14e11ca6f81e\") " Apr 24 19:19:30.860869 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.860842 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb832fca-6eae-401d-a27a-14e11ca6f81e-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "bb832fca-6eae-401d-a27a-14e11ca6f81e" (UID: "bb832fca-6eae-401d-a27a-14e11ca6f81e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:19:30.860956 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.860934 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb832fca-6eae-401d-a27a-14e11ca6f81e-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "bb832fca-6eae-401d-a27a-14e11ca6f81e" (UID: "bb832fca-6eae-401d-a27a-14e11ca6f81e"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:19:30.862631 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.862603 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb832fca-6eae-401d-a27a-14e11ca6f81e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bb832fca-6eae-401d-a27a-14e11ca6f81e" (UID: "bb832fca-6eae-401d-a27a-14e11ca6f81e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:19:30.862725 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.862654 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb832fca-6eae-401d-a27a-14e11ca6f81e-kube-api-access-m7ngz" (OuterVolumeSpecName: "kube-api-access-m7ngz") pod "bb832fca-6eae-401d-a27a-14e11ca6f81e" (UID: "bb832fca-6eae-401d-a27a-14e11ca6f81e"). InnerVolumeSpecName "kube-api-access-m7ngz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:19:30.961153 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.961112 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb832fca-6eae-401d-a27a-14e11ca6f81e-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:19:30.961153 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.961140 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb832fca-6eae-401d-a27a-14e11ca6f81e-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:19:30.961153 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.961154 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb832fca-6eae-401d-a27a-14e11ca6f81e-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:19:30.961449 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:30.961168 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m7ngz\" (UniqueName: \"kubernetes.io/projected/bb832fca-6eae-401d-a27a-14e11ca6f81e-kube-api-access-m7ngz\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:19:31.298980 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.298882 2564 generic.go:358] "Generic (PLEG): container finished" podID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerID="de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7" exitCode=0 Apr 24 19:19:31.298980 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.298969 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" 
event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerDied","Data":"de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7"} Apr 24 19:19:31.299486 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.298982 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" Apr 24 19:19:31.299486 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.299001 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc" event={"ID":"bb832fca-6eae-401d-a27a-14e11ca6f81e","Type":"ContainerDied","Data":"bd54d90b5e9bedcc4ac3ad189d2b737238c9efd0a2cbd17e41b3fef004a52c25"} Apr 24 19:19:31.299486 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.299021 2564 scope.go:117] "RemoveContainer" containerID="de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7" Apr 24 19:19:31.308677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.308491 2564 scope.go:117] "RemoveContainer" containerID="8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d" Apr 24 19:19:31.316190 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.316167 2564 scope.go:117] "RemoveContainer" containerID="f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5" Apr 24 19:19:31.319489 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.319463 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc"] Apr 24 19:19:31.323189 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.323162 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-9dddf7dc4-qprrc"] Apr 24 19:19:31.324631 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.324614 2564 scope.go:117] "RemoveContainer" containerID="cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419" Apr 24 19:19:31.332174 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:19:31.332153 2564 scope.go:117] "RemoveContainer" containerID="de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7" Apr 24 19:19:31.332501 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:19:31.332480 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7\": container with ID starting with de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7 not found: ID does not exist" containerID="de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7" Apr 24 19:19:31.332580 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.332511 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7"} err="failed to get container status \"de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7\": rpc error: code = NotFound desc = could not find container \"de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7\": container with ID starting with de1bbcaaddc6a82d9464aeaf73f545fbfcb935ddc5457977b5caf3645b8777f7 not found: ID does not exist" Apr 24 19:19:31.332580 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.332534 2564 scope.go:117] "RemoveContainer" containerID="8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d" Apr 24 19:19:31.332767 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:19:31.332752 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d\": container with ID starting with 8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d not found: ID does not exist" containerID="8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d" Apr 24 19:19:31.332819 ip-10-0-129-124 kubenswrapper[2564]: 
I0424 19:19:31.332783 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d"} err="failed to get container status \"8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d\": rpc error: code = NotFound desc = could not find container \"8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d\": container with ID starting with 8f339815f80e61c18ec1ab27ab4b7046a4342a62aba4ff9a0bcda12f3339a30d not found: ID does not exist" Apr 24 19:19:31.332819 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.332799 2564 scope.go:117] "RemoveContainer" containerID="f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5" Apr 24 19:19:31.333016 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:19:31.333001 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5\": container with ID starting with f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5 not found: ID does not exist" containerID="f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5" Apr 24 19:19:31.333061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.333020 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5"} err="failed to get container status \"f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5\": rpc error: code = NotFound desc = could not find container \"f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5\": container with ID starting with f370945a1b9ef3de87ce1d0eda1a0bfec8d5dda90dbab86fe3edd71a1367b8c5 not found: ID does not exist" Apr 24 19:19:31.333061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.333033 2564 scope.go:117] "RemoveContainer" 
containerID="cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419" Apr 24 19:19:31.333210 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:19:31.333196 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419\": container with ID starting with cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419 not found: ID does not exist" containerID="cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419" Apr 24 19:19:31.333247 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:31.333211 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419"} err="failed to get container status \"cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419\": rpc error: code = NotFound desc = could not find container \"cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419\": container with ID starting with cd22baa759166911e1bc4cce6e5d3aae194ec573074a9b2365f64e31a9b90419 not found: ID does not exist" Apr 24 19:19:32.237200 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:32.237156 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:19:32.237721 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:32.237695 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:33.185023 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:33.184984 
2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" path="/var/lib/kubelet/pods/bb832fca-6eae-401d-a27a-14e11ca6f81e/volumes" Apr 24 19:19:42.236983 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:42.236932 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:19:42.237466 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:42.237379 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:19:52.237030 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:52.236979 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:19:52.237441 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:19:52.237418 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:02.237210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:02.237164 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:20:02.237682 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:02.237660 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:12.237810 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:12.237711 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:20:12.238274 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:12.238014 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:20:25.868195 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.868159 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r"] Apr 24 19:20:25.868729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.868520 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" containerID="cri-o://d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6" gracePeriod=30 Apr 24 19:20:25.868729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.868573 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy" containerID="cri-o://3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f" gracePeriod=30 Apr 24 19:20:25.868729 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:20:25.868540 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" containerID="cri-o://ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38" gracePeriod=30 Apr 24 19:20:25.922195 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922164 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"] Apr 24 19:20:25.922456 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922445 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" Apr 24 19:20:25.922495 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922459 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" Apr 24 19:20:25.922495 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922480 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" Apr 24 19:20:25.922495 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922485 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" Apr 24 19:20:25.922495 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922495 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" Apr 24 19:20:25.922699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922501 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" Apr 24 19:20:25.922699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922509 2564 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="storage-initializer" Apr 24 19:20:25.922699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922515 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="storage-initializer" Apr 24 19:20:25.922699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922574 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kube-rbac-proxy" Apr 24 19:20:25.922699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922583 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="agent" Apr 24 19:20:25.922699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.922590 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb832fca-6eae-401d-a27a-14e11ca6f81e" containerName="kserve-container" Apr 24 19:20:25.925633 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.925615 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:25.928412 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.928395 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 24 19:20:25.929100 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.929084 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 24 19:20:25.939426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:25.939404 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"] Apr 24 19:20:26.079087 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.079051 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1338ff04-e7dd-4a97-bccf-b850344909fd-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.079087 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.079091 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t99df\" (UniqueName: \"kubernetes.io/projected/1338ff04-e7dd-4a97-bccf-b850344909fd-kube-api-access-t99df\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.079282 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.079120 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1338ff04-e7dd-4a97-bccf-b850344909fd-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.179640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.179536 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1338ff04-e7dd-4a97-bccf-b850344909fd-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.179640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.179589 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t99df\" (UniqueName: \"kubernetes.io/projected/1338ff04-e7dd-4a97-bccf-b850344909fd-kube-api-access-t99df\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.179640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.179621 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1338ff04-e7dd-4a97-bccf-b850344909fd-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.180471 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.180354 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1338ff04-e7dd-4a97-bccf-b850344909fd-message-dumper-kube-rbac-proxy-sar-config\") pod 
\"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.182403 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.182375 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1338ff04-e7dd-4a97-bccf-b850344909fd-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.188350 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.188319 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t99df\" (UniqueName: \"kubernetes.io/projected/1338ff04-e7dd-4a97-bccf-b850344909fd-kube-api-access-t99df\") pod \"message-dumper-predictor-c7d86bcbd-zh6vj\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.235929 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.235890 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:26.350959 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.350929 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"] Apr 24 19:20:26.353972 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:20:26.353947 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1338ff04_e7dd_4a97_bccf_b850344909fd.slice/crio-e80f0a8794d33ba5524b575982c8c4dcfecb9ff80b591a1bbaabd9ca5a3db2dd WatchSource:0}: Error finding container e80f0a8794d33ba5524b575982c8c4dcfecb9ff80b591a1bbaabd9ca5a3db2dd: Status 404 returned error can't find the container with id e80f0a8794d33ba5524b575982c8c4dcfecb9ff80b591a1bbaabd9ca5a3db2dd Apr 24 19:20:26.454048 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.454011 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" event={"ID":"1338ff04-e7dd-4a97-bccf-b850344909fd","Type":"ContainerStarted","Data":"e80f0a8794d33ba5524b575982c8c4dcfecb9ff80b591a1bbaabd9ca5a3db2dd"} Apr 24 19:20:26.456156 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.456129 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bf47579-704a-4869-a77a-44db3e58116f" containerID="3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f" exitCode=2 Apr 24 19:20:26.456271 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:26.456163 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerDied","Data":"3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f"} Apr 24 19:20:27.230683 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:27.230644 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 19:20:27.460622 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:27.460592 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" event={"ID":"1338ff04-e7dd-4a97-bccf-b850344909fd","Type":"ContainerStarted","Data":"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"} Apr 24 19:20:28.465082 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:28.465045 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" event={"ID":"1338ff04-e7dd-4a97-bccf-b850344909fd","Type":"ContainerStarted","Data":"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"} Apr 24 19:20:28.465488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:28.465258 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:28.465488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:28.465364 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:28.467146 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:28.467127 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:28.484037 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:28.483993 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" podStartSLOduration=2.442171087 podStartE2EDuration="3.483982923s" podCreationTimestamp="2026-04-24 19:20:25 
+0000 UTC" firstStartedPulling="2026-04-24 19:20:26.355686243 +0000 UTC m=+819.679930384" lastFinishedPulling="2026-04-24 19:20:27.397498079 +0000 UTC m=+820.721742220" observedRunningTime="2026-04-24 19:20:28.482787779 +0000 UTC m=+821.807031941" watchObservedRunningTime="2026-04-24 19:20:28.483982923 +0000 UTC m=+821.808227085" Apr 24 19:20:30.472864 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:30.472834 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bf47579-704a-4869-a77a-44db3e58116f" containerID="d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6" exitCode=0 Apr 24 19:20:30.473225 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:30.472908 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerDied","Data":"d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6"} Apr 24 19:20:32.230422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:32.230378 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 19:20:32.236617 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:32.236585 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:20:32.236902 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:32.236873 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:35.477737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:35.477706 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" Apr 24 19:20:35.971308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:35.971267 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"] Apr 24 19:20:35.974933 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:35.974916 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:35.977433 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:35.977405 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 24 19:20:35.977433 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:35.977418 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 24 19:20:35.984167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:35.984136 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"] Apr 24 19:20:36.063230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.063185 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/802fff64-5285-4b1c-acb6-851abdd537cf-kserve-provision-location\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 
24 19:20:36.063399 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.063272 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzlmv\" (UniqueName: \"kubernetes.io/projected/802fff64-5285-4b1c-acb6-851abdd537cf-kube-api-access-wzlmv\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.063399 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.063319 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/802fff64-5285-4b1c-acb6-851abdd537cf-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.063399 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.063348 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/802fff64-5285-4b1c-acb6-851abdd537cf-proxy-tls\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.164091 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.164058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzlmv\" (UniqueName: \"kubernetes.io/projected/802fff64-5285-4b1c-acb6-851abdd537cf-kube-api-access-wzlmv\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.164283 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.164104 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/802fff64-5285-4b1c-acb6-851abdd537cf-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.164283 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.164132 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/802fff64-5285-4b1c-acb6-851abdd537cf-proxy-tls\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.164283 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.164158 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/802fff64-5285-4b1c-acb6-851abdd537cf-kserve-provision-location\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.164517 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.164498 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/802fff64-5285-4b1c-acb6-851abdd537cf-kserve-provision-location\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.164897 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.164872 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/802fff64-5285-4b1c-acb6-851abdd537cf-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.166597 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.166580 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/802fff64-5285-4b1c-acb6-851abdd537cf-proxy-tls\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.172072 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.172048 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzlmv\" (UniqueName: \"kubernetes.io/projected/802fff64-5285-4b1c-acb6-851abdd537cf-kube-api-access-wzlmv\") pod \"isvc-logger-predictor-5868ff97df-k7dwq\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.285736 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.285634 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:36.409410 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.409383 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"] Apr 24 19:20:36.412102 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:20:36.412068 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod802fff64_5285_4b1c_acb6_851abdd537cf.slice/crio-0a6a3f03592670b1a99bcfe4aba742f59cf9e784debc249057b78901116e01f3 WatchSource:0}: Error finding container 0a6a3f03592670b1a99bcfe4aba742f59cf9e784debc249057b78901116e01f3: Status 404 returned error can't find the container with id 0a6a3f03592670b1a99bcfe4aba742f59cf9e784debc249057b78901116e01f3 Apr 24 19:20:36.490799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.490759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerStarted","Data":"3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8"} Apr 24 19:20:36.490799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:36.490802 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerStarted","Data":"0a6a3f03592670b1a99bcfe4aba742f59cf9e784debc249057b78901116e01f3"} Apr 24 19:20:37.231027 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:37.230983 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 
19:20:37.231221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:37.231114 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:20:40.502511 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:40.502474 2564 generic.go:358] "Generic (PLEG): container finished" podID="802fff64-5285-4b1c-acb6-851abdd537cf" containerID="3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8" exitCode=0 Apr 24 19:20:40.502920 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:40.502531 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerDied","Data":"3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8"} Apr 24 19:20:41.507382 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.507345 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerStarted","Data":"10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93"} Apr 24 19:20:41.507382 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.507386 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerStarted","Data":"469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934"} Apr 24 19:20:41.507953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.507398 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerStarted","Data":"9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c"} Apr 24 19:20:41.507953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.507759 
2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:41.507953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.507795 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:41.507953 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.507808 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:41.509137 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.509100 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 19:20:41.509820 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.509798 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:41.529084 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:41.529039 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podStartSLOduration=6.529026777 podStartE2EDuration="6.529026777s" podCreationTimestamp="2026-04-24 19:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:20:41.527456473 +0000 UTC m=+834.851700649" watchObservedRunningTime="2026-04-24 19:20:41.529026777 +0000 UTC m=+834.853270939" Apr 24 19:20:42.230850 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:42.230808 2564 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 19:20:42.237030 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:42.236996 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 24 19:20:42.237355 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:42.237329 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:42.510965 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:42.510860 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 19:20:42.511366 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:42.511238 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:47.231166 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:47.231127 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 19:20:47.515291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:47.515211 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:20:47.515813 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:47.515780 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 19:20:47.516235 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:47.516211 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:52.230437 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:52.230390 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 24 19:20:52.236883 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:52.236841 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: 
connect: connection refused" Apr 24 19:20:52.237049 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:52.237033 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:20:52.237262 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:52.237236 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:20:52.237361 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:52.237349 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:20:56.013209 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.013185 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:20:56.123812 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.123724 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bf47579-704a-4869-a77a-44db3e58116f-kserve-provision-location\") pod \"3bf47579-704a-4869-a77a-44db3e58116f\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " Apr 24 19:20:56.123812 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.123773 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmk7n\" (UniqueName: \"kubernetes.io/projected/3bf47579-704a-4869-a77a-44db3e58116f-kube-api-access-bmk7n\") pod \"3bf47579-704a-4869-a77a-44db3e58116f\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " Apr 24 19:20:56.123998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.123836 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bf47579-704a-4869-a77a-44db3e58116f-proxy-tls\") pod \"3bf47579-704a-4869-a77a-44db3e58116f\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " Apr 24 19:20:56.123998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.123884 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bf47579-704a-4869-a77a-44db3e58116f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"3bf47579-704a-4869-a77a-44db3e58116f\" (UID: \"3bf47579-704a-4869-a77a-44db3e58116f\") " Apr 24 19:20:56.124138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.124117 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bf47579-704a-4869-a77a-44db3e58116f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") 
pod "3bf47579-704a-4869-a77a-44db3e58116f" (UID: "3bf47579-704a-4869-a77a-44db3e58116f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:20:56.124319 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.124295 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf47579-704a-4869-a77a-44db3e58116f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "3bf47579-704a-4869-a77a-44db3e58116f" (UID: "3bf47579-704a-4869-a77a-44db3e58116f"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:20:56.126043 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.126011 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf47579-704a-4869-a77a-44db3e58116f-kube-api-access-bmk7n" (OuterVolumeSpecName: "kube-api-access-bmk7n") pod "3bf47579-704a-4869-a77a-44db3e58116f" (UID: "3bf47579-704a-4869-a77a-44db3e58116f"). InnerVolumeSpecName "kube-api-access-bmk7n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:20:56.126145 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.126012 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf47579-704a-4869-a77a-44db3e58116f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3bf47579-704a-4869-a77a-44db3e58116f" (UID: "3bf47579-704a-4869-a77a-44db3e58116f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:20:56.224785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.224745 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3bf47579-704a-4869-a77a-44db3e58116f-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:20:56.224785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.224770 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3bf47579-704a-4869-a77a-44db3e58116f-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:20:56.224785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.224789 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3bf47579-704a-4869-a77a-44db3e58116f-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:20:56.225003 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.224800 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bmk7n\" (UniqueName: \"kubernetes.io/projected/3bf47579-704a-4869-a77a-44db3e58116f-kube-api-access-bmk7n\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:20:56.552212 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.552174 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bf47579-704a-4869-a77a-44db3e58116f" containerID="ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38" exitCode=0 Apr 24 19:20:56.552362 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.552239 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" 
event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerDied","Data":"ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38"} Apr 24 19:20:56.552362 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.552269 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" event={"ID":"3bf47579-704a-4869-a77a-44db3e58116f","Type":"ContainerDied","Data":"588bbdcc4e7e99abf78617c1e7ed72cf614d8d445fc34e8ffdc283538f52e88e"} Apr 24 19:20:56.552362 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.552277 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r" Apr 24 19:20:56.552362 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.552290 2564 scope.go:117] "RemoveContainer" containerID="ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38" Apr 24 19:20:56.561029 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.560922 2564 scope.go:117] "RemoveContainer" containerID="3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f" Apr 24 19:20:56.568010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.567994 2564 scope.go:117] "RemoveContainer" containerID="d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6" Apr 24 19:20:56.575152 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.575118 2564 scope.go:117] "RemoveContainer" containerID="cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650" Apr 24 19:20:56.576721 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.576697 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r"] Apr 24 19:20:56.581051 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.581026 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-67d58bf66c-qnb4r"] Apr 24 
19:20:56.582284 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.582271 2564 scope.go:117] "RemoveContainer" containerID="ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38" Apr 24 19:20:56.582523 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:20:56.582502 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38\": container with ID starting with ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38 not found: ID does not exist" containerID="ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38" Apr 24 19:20:56.582613 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.582535 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38"} err="failed to get container status \"ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38\": rpc error: code = NotFound desc = could not find container \"ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38\": container with ID starting with ee56509bf5c48eb732bfcd831ada34f465e4bb87856945aa53c89f3b96345b38 not found: ID does not exist" Apr 24 19:20:56.582613 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.582578 2564 scope.go:117] "RemoveContainer" containerID="3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f" Apr 24 19:20:56.582812 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:20:56.582792 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f\": container with ID starting with 3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f not found: ID does not exist" containerID="3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f" Apr 24 19:20:56.582849 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.582819 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f"} err="failed to get container status \"3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f\": rpc error: code = NotFound desc = could not find container \"3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f\": container with ID starting with 3aa569db33d6fd760323beb6946460e4a59cf08de27483f75e8d232ea5dc1c3f not found: ID does not exist" Apr 24 19:20:56.582849 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.582837 2564 scope.go:117] "RemoveContainer" containerID="d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6" Apr 24 19:20:56.583027 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:20:56.583011 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6\": container with ID starting with d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6 not found: ID does not exist" containerID="d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6" Apr 24 19:20:56.583064 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.583030 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6"} err="failed to get container status \"d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6\": rpc error: code = NotFound desc = could not find container \"d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6\": container with ID starting with d441e5792bdc5bfbb5c034ba33e507c074b3f761d26419bab5172e20add6c1d6 not found: ID does not exist" Apr 24 19:20:56.583064 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.583044 2564 scope.go:117] 
"RemoveContainer" containerID="cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650" Apr 24 19:20:56.583219 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:20:56.583204 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650\": container with ID starting with cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650 not found: ID does not exist" containerID="cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650" Apr 24 19:20:56.583253 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:56.583222 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650"} err="failed to get container status \"cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650\": rpc error: code = NotFound desc = could not find container \"cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650\": container with ID starting with cb6572e3cc86c22b726bdf68e6e20333c1ae218086163dc659a5d46266fe1650 not found: ID does not exist" Apr 24 19:20:57.185177 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:57.185148 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf47579-704a-4869-a77a-44db3e58116f" path="/var/lib/kubelet/pods/3bf47579-704a-4869-a77a-44db3e58116f/volumes" Apr 24 19:20:57.515778 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:57.515735 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 19:20:57.516172 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:20:57.516135 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:07.516415 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:07.516368 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 19:21:07.516969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:07.516829 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:17.516456 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:17.516405 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 19:21:17.516968 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:17.516938 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:27.516209 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:27.516159 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 
19:21:27.516676 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:27.516545 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:37.516593 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:37.516484 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 24 19:21:37.517024 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:37.516996 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 19:21:47.516744 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:47.516713 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:21:47.517176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:21:47.516997 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" Apr 24 19:22:00.978256 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:00.978220 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-zh6vj_1338ff04-e7dd-4a97-bccf-b850344909fd/kserve-container/0.log" Apr 24 19:22:01.116771 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.116738 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"] Apr 24 19:22:01.117164 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:22:01.117105 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kserve-container" containerID="cri-o://015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11" gracePeriod=30 Apr 24 19:22:01.117299 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.117138 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kube-rbac-proxy" containerID="cri-o://b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716" gracePeriod=30 Apr 24 19:22:01.187231 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.187190 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"] Apr 24 19:22:01.187637 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.187589 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" containerID="cri-o://9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c" gracePeriod=30 Apr 24 19:22:01.187637 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.187608 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" containerID="cri-o://10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93" gracePeriod=30 Apr 24 19:22:01.187885 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.187648 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" 
podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" containerID="cri-o://469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934" gracePeriod=30
Apr 24 19:22:01.208045 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208018 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"]
Apr 24 19:22:01.208316 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208305 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="storage-initializer"
Apr 24 19:22:01.208358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208319 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="storage-initializer"
Apr 24 19:22:01.208358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208329 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy"
Apr 24 19:22:01.208358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208334 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy"
Apr 24 19:22:01.208358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208342 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container"
Apr 24 19:22:01.208358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208348 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container"
Apr 24 19:22:01.208504 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208367 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent"
Apr 24 19:22:01.208504 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208372 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent"
Apr 24 19:22:01.208504 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208412 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kube-rbac-proxy"
Apr 24 19:22:01.208504 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208420 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="kserve-container"
Apr 24 19:22:01.208504 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.208428 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bf47579-704a-4869-a77a-44db3e58116f" containerName="agent"
Apr 24 19:22:01.211518 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.211502 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.213808 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.213788 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\""
Apr 24 19:22:01.213923 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.213875 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\""
Apr 24 19:22:01.223839 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.223812 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"]
Apr 24 19:22:01.236809 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.236710 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.236809 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.236757 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6092c070-c1a5-4043-a9b1-8370a7de0fee-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.236809 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.236804 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6092c070-c1a5-4043-a9b1-8370a7de0fee-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.237013 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.236898 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xqlp\" (UniqueName: \"kubernetes.io/projected/6092c070-c1a5-4043-a9b1-8370a7de0fee-kube-api-access-6xqlp\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.337239 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.337206 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.337239 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.337251 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6092c070-c1a5-4043-a9b1-8370a7de0fee-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.337431 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.337279 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6092c070-c1a5-4043-a9b1-8370a7de0fee-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.337431 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.337316 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6xqlp\" (UniqueName: \"kubernetes.io/projected/6092c070-c1a5-4043-a9b1-8370a7de0fee-kube-api-access-6xqlp\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.337662 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:01.337630 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found
Apr 24 19:22:01.337789 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:01.337702 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls podName:6092c070-c1a5-4043-a9b1-8370a7de0fee nodeName:}" failed. No retries permitted until 2026-04-24 19:22:01.837678889 +0000 UTC m=+915.161923048 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-gvps4" (UID: "6092c070-c1a5-4043-a9b1-8370a7de0fee") : secret "isvc-lightgbm-predictor-serving-cert" not found
Apr 24 19:22:01.337871 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.337852 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6092c070-c1a5-4043-a9b1-8370a7de0fee-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.338167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.338147 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6092c070-c1a5-4043-a9b1-8370a7de0fee-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.349302 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.349277 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xqlp\" (UniqueName: \"kubernetes.io/projected/6092c070-c1a5-4043-a9b1-8370a7de0fee-kube-api-access-6xqlp\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.420283 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.420258 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"
Apr 24 19:22:01.437635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.437606 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t99df\" (UniqueName: \"kubernetes.io/projected/1338ff04-e7dd-4a97-bccf-b850344909fd-kube-api-access-t99df\") pod \"1338ff04-e7dd-4a97-bccf-b850344909fd\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") "
Apr 24 19:22:01.437776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.437670 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1338ff04-e7dd-4a97-bccf-b850344909fd-proxy-tls\") pod \"1338ff04-e7dd-4a97-bccf-b850344909fd\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") "
Apr 24 19:22:01.437776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.437695 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1338ff04-e7dd-4a97-bccf-b850344909fd-message-dumper-kube-rbac-proxy-sar-config\") pod \"1338ff04-e7dd-4a97-bccf-b850344909fd\" (UID: \"1338ff04-e7dd-4a97-bccf-b850344909fd\") "
Apr 24 19:22:01.438125 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.438077 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1338ff04-e7dd-4a97-bccf-b850344909fd-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "1338ff04-e7dd-4a97-bccf-b850344909fd" (UID: "1338ff04-e7dd-4a97-bccf-b850344909fd"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:22:01.440367 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.440341 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1338ff04-e7dd-4a97-bccf-b850344909fd-kube-api-access-t99df" (OuterVolumeSpecName: "kube-api-access-t99df") pod "1338ff04-e7dd-4a97-bccf-b850344909fd" (UID: "1338ff04-e7dd-4a97-bccf-b850344909fd"). InnerVolumeSpecName "kube-api-access-t99df". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:22:01.440482 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.440341 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1338ff04-e7dd-4a97-bccf-b850344909fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1338ff04-e7dd-4a97-bccf-b850344909fd" (UID: "1338ff04-e7dd-4a97-bccf-b850344909fd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:22:01.539180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.539075 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t99df\" (UniqueName: \"kubernetes.io/projected/1338ff04-e7dd-4a97-bccf-b850344909fd-kube-api-access-t99df\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:22:01.539180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.539116 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1338ff04-e7dd-4a97-bccf-b850344909fd-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:22:01.539180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.539130 2564 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1338ff04-e7dd-4a97-bccf-b850344909fd-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:22:01.740329 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.740289 2564 generic.go:358] "Generic (PLEG): container finished" podID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerID="b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716" exitCode=2
Apr 24 19:22:01.740329 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.740318 2564 generic.go:358] "Generic (PLEG): container finished" podID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerID="015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11" exitCode=2
Apr 24 19:22:01.740537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.740363 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"
Apr 24 19:22:01.740537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.740376 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" event={"ID":"1338ff04-e7dd-4a97-bccf-b850344909fd","Type":"ContainerDied","Data":"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"}
Apr 24 19:22:01.740537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.740410 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" event={"ID":"1338ff04-e7dd-4a97-bccf-b850344909fd","Type":"ContainerDied","Data":"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"}
Apr 24 19:22:01.740537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.740420 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj" event={"ID":"1338ff04-e7dd-4a97-bccf-b850344909fd","Type":"ContainerDied","Data":"e80f0a8794d33ba5524b575982c8c4dcfecb9ff80b591a1bbaabd9ca5a3db2dd"}
Apr 24 19:22:01.740537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.740433 2564 scope.go:117] "RemoveContainer" containerID="b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"
Apr 24 19:22:01.742599 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.742571 2564 generic.go:358] "Generic (PLEG): container finished" podID="802fff64-5285-4b1c-acb6-851abdd537cf" containerID="469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934" exitCode=2
Apr 24 19:22:01.742710 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.742603 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerDied","Data":"469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934"}
Apr 24 19:22:01.748297 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.748278 2564 scope.go:117] "RemoveContainer" containerID="015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"
Apr 24 19:22:01.755421 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.755402 2564 scope.go:117] "RemoveContainer" containerID="b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"
Apr 24 19:22:01.755708 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:01.755685 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716\": container with ID starting with b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716 not found: ID does not exist" containerID="b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"
Apr 24 19:22:01.755759 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.755717 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"} err="failed to get container status \"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716\": rpc error: code = NotFound desc = could not find container \"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716\": container with ID starting with b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716 not found: ID does not exist"
Apr 24 19:22:01.755759 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.755734 2564 scope.go:117] "RemoveContainer" containerID="015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"
Apr 24 19:22:01.755955 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:01.755937 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11\": container with ID starting with 015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11 not found: ID does not exist" containerID="015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"
Apr 24 19:22:01.755996 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.755961 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"} err="failed to get container status \"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11\": rpc error: code = NotFound desc = could not find container \"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11\": container with ID starting with 015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11 not found: ID does not exist"
Apr 24 19:22:01.755996 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.755976 2564 scope.go:117] "RemoveContainer" containerID="b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"
Apr 24 19:22:01.756220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.756201 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716"} err="failed to get container status \"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716\": rpc error: code = NotFound desc = could not find container \"b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716\": container with ID starting with b1ef4f1076a1442b2f2854bd236c8433ea900ae59582b70ef75a36fdd784e716 not found: ID does not exist"
Apr 24 19:22:01.756269 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.756221 2564 scope.go:117] "RemoveContainer" containerID="015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"
Apr 24 19:22:01.756421 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.756405 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11"} err="failed to get container status \"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11\": rpc error: code = NotFound desc = could not find container \"015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11\": container with ID starting with 015620e4a40f149cb9cd025e1236efa02fa59745bdfc293b3cc5b8edfcbcab11 not found: ID does not exist"
Apr 24 19:22:01.761824 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.761790 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"]
Apr 24 19:22:01.765085 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.765065 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-zh6vj"]
Apr 24 19:22:01.840219 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.840127 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:01.842498 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:01.842477 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-gvps4\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:02.121302 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:02.121203 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:02.246584 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:02.246537 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"]
Apr 24 19:22:02.249417 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:22:02.249386 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6092c070_c1a5_4043_a9b1_8370a7de0fee.slice/crio-05c342cfe6cd6c9f112d7a35cc1a584755370a1e5eac1de3db6931cfefde53d4 WatchSource:0}: Error finding container 05c342cfe6cd6c9f112d7a35cc1a584755370a1e5eac1de3db6931cfefde53d4: Status 404 returned error can't find the container with id 05c342cfe6cd6c9f112d7a35cc1a584755370a1e5eac1de3db6931cfefde53d4
Apr 24 19:22:02.511723 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:02.511665 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 24 19:22:02.747583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:02.747530 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerStarted","Data":"9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144"}
Apr 24 19:22:02.747583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:02.747583 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerStarted","Data":"05c342cfe6cd6c9f112d7a35cc1a584755370a1e5eac1de3db6931cfefde53d4"}
Apr 24 19:22:03.185308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:03.185269 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" path="/var/lib/kubelet/pods/1338ff04-e7dd-4a97-bccf-b850344909fd/volumes"
Apr 24 19:22:05.758352 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:05.758314 2564 generic.go:358] "Generic (PLEG): container finished" podID="802fff64-5285-4b1c-acb6-851abdd537cf" containerID="9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c" exitCode=0
Apr 24 19:22:05.758730 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:05.758384 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerDied","Data":"9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c"}
Apr 24 19:22:06.762859 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:06.762825 2564 generic.go:358] "Generic (PLEG): container finished" podID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerID="9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144" exitCode=0
Apr 24 19:22:06.763243 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:06.762899 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerDied","Data":"9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144"}
Apr 24 19:22:07.511879 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:07.511785 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 24 19:22:07.516465 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:07.516416 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 24 19:22:07.516838 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:07.516793 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 19:22:12.511469 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:12.511421 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 24 19:22:12.511949 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:12.511593 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"
Apr 24 19:22:13.788880 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:13.788846 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerStarted","Data":"f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781"}
Apr 24 19:22:13.789378 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:13.788891 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerStarted","Data":"39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29"}
Apr 24 19:22:13.789378 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:13.789092 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:13.808589 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:13.808520 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podStartSLOduration=6.642600808 podStartE2EDuration="12.808505377s" podCreationTimestamp="2026-04-24 19:22:01 +0000 UTC" firstStartedPulling="2026-04-24 19:22:06.764268821 +0000 UTC m=+920.088512963" lastFinishedPulling="2026-04-24 19:22:12.930173391 +0000 UTC m=+926.254417532" observedRunningTime="2026-04-24 19:22:13.807237689 +0000 UTC m=+927.131481854" watchObservedRunningTime="2026-04-24 19:22:13.808505377 +0000 UTC m=+927.132749541"
Apr 24 19:22:14.791772 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:14.791735 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:14.792863 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:14.792836 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 24 19:22:15.794087 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:15.794041 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 24 19:22:17.511008 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:17.510959 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 24 19:22:17.516446 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:17.516407 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 24 19:22:17.516755 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:17.516716 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 19:22:20.798446 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:20.798412 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"
Apr 24 19:22:20.799065 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:20.799035 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 24 19:22:22.511992 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:22.511941 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 24 19:22:27.511572 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:27.511506 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused"
Apr 24 19:22:27.515963 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:27.515925 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 24 19:22:27.516085 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:27.516071 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"
Apr 24 19:22:27.516283 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:27.516260 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 24 19:22:27.516371 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:27.516360 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"
Apr 24 19:22:30.799477 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:30.799436 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 24 19:22:31.330006 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.329981 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"
Apr 24 19:22:31.365398 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.365360 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/802fff64-5285-4b1c-acb6-851abdd537cf-proxy-tls\") pod \"802fff64-5285-4b1c-acb6-851abdd537cf\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") "
Apr 24 19:22:31.365606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.365435 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzlmv\" (UniqueName: \"kubernetes.io/projected/802fff64-5285-4b1c-acb6-851abdd537cf-kube-api-access-wzlmv\") pod \"802fff64-5285-4b1c-acb6-851abdd537cf\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") "
Apr 24 19:22:31.365606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.365468 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/802fff64-5285-4b1c-acb6-851abdd537cf-isvc-logger-kube-rbac-proxy-sar-config\") pod \"802fff64-5285-4b1c-acb6-851abdd537cf\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") "
Apr 24 19:22:31.365606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.365493 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/802fff64-5285-4b1c-acb6-851abdd537cf-kserve-provision-location\") pod \"802fff64-5285-4b1c-acb6-851abdd537cf\" (UID: \"802fff64-5285-4b1c-acb6-851abdd537cf\") "
Apr 24 19:22:31.365864 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.365831 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/802fff64-5285-4b1c-acb6-851abdd537cf-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "802fff64-5285-4b1c-acb6-851abdd537cf" (UID: "802fff64-5285-4b1c-acb6-851abdd537cf"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:22:31.365970 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.365871 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/802fff64-5285-4b1c-acb6-851abdd537cf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "802fff64-5285-4b1c-acb6-851abdd537cf" (UID: "802fff64-5285-4b1c-acb6-851abdd537cf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:22:31.367536 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.367512 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/802fff64-5285-4b1c-acb6-851abdd537cf-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "802fff64-5285-4b1c-acb6-851abdd537cf" (UID: "802fff64-5285-4b1c-acb6-851abdd537cf"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:22:31.367536 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.367523 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/802fff64-5285-4b1c-acb6-851abdd537cf-kube-api-access-wzlmv" (OuterVolumeSpecName: "kube-api-access-wzlmv") pod "802fff64-5285-4b1c-acb6-851abdd537cf" (UID: "802fff64-5285-4b1c-acb6-851abdd537cf"). InnerVolumeSpecName "kube-api-access-wzlmv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:22:31.466897 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.466863 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wzlmv\" (UniqueName: \"kubernetes.io/projected/802fff64-5285-4b1c-acb6-851abdd537cf-kube-api-access-wzlmv\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:22:31.466897 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.466895 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/802fff64-5285-4b1c-acb6-851abdd537cf-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:22:31.466897 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.466906 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/802fff64-5285-4b1c-acb6-851abdd537cf-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:22:31.467119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.466916 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/802fff64-5285-4b1c-acb6-851abdd537cf-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:22:31.838487 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.838398 2564 generic.go:358] "Generic (PLEG): container finished" podID="802fff64-5285-4b1c-acb6-851abdd537cf" containerID="10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93" exitCode=137
Apr 24 19:22:31.838888 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.838491 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"
Apr 24 19:22:31.838888 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.838489 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerDied","Data":"10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93"}
Apr 24 19:22:31.838888 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.838603 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq" event={"ID":"802fff64-5285-4b1c-acb6-851abdd537cf","Type":"ContainerDied","Data":"0a6a3f03592670b1a99bcfe4aba742f59cf9e784debc249057b78901116e01f3"}
Apr 24 19:22:31.838888 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.838623 2564 scope.go:117] "RemoveContainer" containerID="10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93"
Apr 24 19:22:31.848203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.848178 2564 scope.go:117] "RemoveContainer" containerID="469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934"
Apr 24 19:22:31.855538 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.855515 2564 scope.go:117] "RemoveContainer" containerID="9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c"
Apr 24 19:22:31.862885 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.862853 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"]
Apr 24 19:22:31.863683 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.863661 2564 scope.go:117] "RemoveContainer"
containerID="3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8" Apr 24 19:22:31.866192 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.866168 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5868ff97df-k7dwq"] Apr 24 19:22:31.872083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.872062 2564 scope.go:117] "RemoveContainer" containerID="10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93" Apr 24 19:22:31.872370 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:31.872350 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93\": container with ID starting with 10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93 not found: ID does not exist" containerID="10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93" Apr 24 19:22:31.872437 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.872379 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93"} err="failed to get container status \"10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93\": rpc error: code = NotFound desc = could not find container \"10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93\": container with ID starting with 10dcb8c3ad6dc4d3a5506e0acceaef01f9d8365cba58266cfaa4f17b4a528c93 not found: ID does not exist" Apr 24 19:22:31.872437 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.872398 2564 scope.go:117] "RemoveContainer" containerID="469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934" Apr 24 19:22:31.872690 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:31.872673 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934\": container with ID starting with 469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934 not found: ID does not exist" containerID="469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934" Apr 24 19:22:31.872752 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.872695 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934"} err="failed to get container status \"469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934\": rpc error: code = NotFound desc = could not find container \"469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934\": container with ID starting with 469be47e300770797359a3e1d17f9e4af6965a7b6f71d9b30d1bc63c6335b934 not found: ID does not exist" Apr 24 19:22:31.872752 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.872715 2564 scope.go:117] "RemoveContainer" containerID="9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c" Apr 24 19:22:31.872965 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:31.872939 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c\": container with ID starting with 9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c not found: ID does not exist" containerID="9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c" Apr 24 19:22:31.873008 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.872969 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c"} err="failed to get container status \"9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c\": rpc error: code = NotFound desc = could not find container 
\"9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c\": container with ID starting with 9b7e471f4e93667f774533d98b7f9ba024f4590ef832d9056681917920574a9c not found: ID does not exist" Apr 24 19:22:31.873008 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.872982 2564 scope.go:117] "RemoveContainer" containerID="3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8" Apr 24 19:22:31.873211 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:22:31.873194 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8\": container with ID starting with 3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8 not found: ID does not exist" containerID="3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8" Apr 24 19:22:31.873262 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:31.873214 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8"} err="failed to get container status \"3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8\": rpc error: code = NotFound desc = could not find container \"3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8\": container with ID starting with 3a5c997a6fb7e603df1652ec39db86903ffddb67978b1c557a708856fa0b7aa8 not found: ID does not exist" Apr 24 19:22:33.183946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:33.183909 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" path="/var/lib/kubelet/pods/802fff64-5285-4b1c-acb6-851abdd537cf/volumes" Apr 24 19:22:40.798978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:40.798937 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" 
podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 19:22:50.800002 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:22:50.799957 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 19:23:00.799045 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:00.799003 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 19:23:10.799073 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:10.798982 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 19:23:20.799982 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:20.799941 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 19:23:25.180307 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:25.180266 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection 
refused" Apr 24 19:23:35.183881 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:35.183853 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" Apr 24 19:23:41.353246 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.353207 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"] Apr 24 19:23:41.353790 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.353621 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" containerID="cri-o://39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29" gracePeriod=30 Apr 24 19:23:41.353790 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.353777 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kube-rbac-proxy" containerID="cri-o://f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781" gracePeriod=30 Apr 24 19:23:41.491829 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.491793 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r"] Apr 24 19:23:41.492091 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492079 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" Apr 24 19:23:41.492134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492094 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" Apr 24 19:23:41.492134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492103 2564 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kube-rbac-proxy" Apr 24 19:23:41.492134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492108 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kube-rbac-proxy" Apr 24 19:23:41.492134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492130 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kserve-container" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492139 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kserve-container" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492147 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="storage-initializer" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492152 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="storage-initializer" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492160 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492165 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492172 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:23:41.492177 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492232 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kserve-container" Apr 24 19:23:41.492250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492244 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kserve-container" Apr 24 19:23:41.492522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492254 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="agent" Apr 24 19:23:41.492522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492261 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1338ff04-e7dd-4a97-bccf-b850344909fd" containerName="kube-rbac-proxy" Apr 24 19:23:41.492522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.492267 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="802fff64-5285-4b1c-acb6-851abdd537cf" containerName="kube-rbac-proxy" Apr 24 19:23:41.495163 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.495148 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.497510 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.497484 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\"" Apr 24 19:23:41.497681 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.497537 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\"" Apr 24 19:23:41.503017 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.502995 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r"] Apr 24 19:23:41.605030 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.604939 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jv8gz\" (UniqueName: \"kubernetes.io/projected/05101861-d839-47ed-a635-7210ff57c7a4-kube-api-access-jv8gz\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.605030 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.604988 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05101861-d839-47ed-a635-7210ff57c7a4-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.605206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.605050 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/05101861-d839-47ed-a635-7210ff57c7a4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.605206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.605120 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05101861-d839-47ed-a635-7210ff57c7a4-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.706419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.706366 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jv8gz\" (UniqueName: \"kubernetes.io/projected/05101861-d839-47ed-a635-7210ff57c7a4-kube-api-access-jv8gz\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.706419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.706427 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05101861-d839-47ed-a635-7210ff57c7a4-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.706684 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.706462 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/05101861-d839-47ed-a635-7210ff57c7a4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.706684 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.706498 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05101861-d839-47ed-a635-7210ff57c7a4-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.706939 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.706909 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05101861-d839-47ed-a635-7210ff57c7a4-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.707167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.707148 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/05101861-d839-47ed-a635-7210ff57c7a4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.708958 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.708939 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05101861-d839-47ed-a635-7210ff57c7a4-proxy-tls\") pod 
\"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.714206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.714169 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jv8gz\" (UniqueName: \"kubernetes.io/projected/05101861-d839-47ed-a635-7210ff57c7a4-kube-api-access-jv8gz\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.805053 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.805016 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:41.923463 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:41.923425 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r"] Apr 24 19:23:41.926750 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:23:41.926717 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05101861_d839_47ed_a635_7210ff57c7a4.slice/crio-66df70730bf21366233d1d9d01ad79322c87d317913ceec171183608da999e77 WatchSource:0}: Error finding container 66df70730bf21366233d1d9d01ad79322c87d317913ceec171183608da999e77: Status 404 returned error can't find the container with id 66df70730bf21366233d1d9d01ad79322c87d317913ceec171183608da999e77 Apr 24 19:23:42.036407 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:42.036369 2564 generic.go:358] "Generic (PLEG): container finished" podID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerID="f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781" exitCode=2 Apr 24 19:23:42.036607 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:42.036443 
2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerDied","Data":"f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781"} Apr 24 19:23:42.037853 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:42.037831 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerStarted","Data":"d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63"} Apr 24 19:23:42.037950 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:42.037859 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerStarted","Data":"66df70730bf21366233d1d9d01ad79322c87d317913ceec171183608da999e77"} Apr 24 19:23:45.180384 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:45.180337 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 24 19:23:45.794869 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:45.794823 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.25:8643/healthz\": dial tcp 10.132.0.25:8643: connect: connection refused" Apr 24 19:23:45.994188 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:45.994159 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" Apr 24 19:23:46.040291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.040207 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xqlp\" (UniqueName: \"kubernetes.io/projected/6092c070-c1a5-4043-a9b1-8370a7de0fee-kube-api-access-6xqlp\") pod \"6092c070-c1a5-4043-a9b1-8370a7de0fee\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " Apr 24 19:23:46.040291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.040251 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6092c070-c1a5-4043-a9b1-8370a7de0fee-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"6092c070-c1a5-4043-a9b1-8370a7de0fee\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " Apr 24 19:23:46.040516 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.040294 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls\") pod \"6092c070-c1a5-4043-a9b1-8370a7de0fee\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " Apr 24 19:23:46.040516 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.040343 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6092c070-c1a5-4043-a9b1-8370a7de0fee-kserve-provision-location\") pod \"6092c070-c1a5-4043-a9b1-8370a7de0fee\" (UID: \"6092c070-c1a5-4043-a9b1-8370a7de0fee\") " Apr 24 19:23:46.040679 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.040649 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6092c070-c1a5-4043-a9b1-8370a7de0fee-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod 
"6092c070-c1a5-4043-a9b1-8370a7de0fee" (UID: "6092c070-c1a5-4043-a9b1-8370a7de0fee"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:23:46.040742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.040715 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6092c070-c1a5-4043-a9b1-8370a7de0fee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6092c070-c1a5-4043-a9b1-8370a7de0fee" (UID: "6092c070-c1a5-4043-a9b1-8370a7de0fee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:23:46.042310 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.042287 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6092c070-c1a5-4043-a9b1-8370a7de0fee" (UID: "6092c070-c1a5-4043-a9b1-8370a7de0fee"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:23:46.042386 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.042339 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6092c070-c1a5-4043-a9b1-8370a7de0fee-kube-api-access-6xqlp" (OuterVolumeSpecName: "kube-api-access-6xqlp") pod "6092c070-c1a5-4043-a9b1-8370a7de0fee" (UID: "6092c070-c1a5-4043-a9b1-8370a7de0fee"). InnerVolumeSpecName "kube-api-access-6xqlp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:23:46.049061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.049034 2564 generic.go:358] "Generic (PLEG): container finished" podID="05101861-d839-47ed-a635-7210ff57c7a4" containerID="d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63" exitCode=0 Apr 24 19:23:46.049191 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.049113 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerDied","Data":"d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63"} Apr 24 19:23:46.050870 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.050849 2564 generic.go:358] "Generic (PLEG): container finished" podID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerID="39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29" exitCode=0 Apr 24 19:23:46.050954 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.050891 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerDied","Data":"39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29"} Apr 24 19:23:46.050954 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.050912 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" event={"ID":"6092c070-c1a5-4043-a9b1-8370a7de0fee","Type":"ContainerDied","Data":"05c342cfe6cd6c9f112d7a35cc1a584755370a1e5eac1de3db6931cfefde53d4"} Apr 24 19:23:46.050954 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.050922 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4" Apr 24 19:23:46.051072 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.050932 2564 scope.go:117] "RemoveContainer" containerID="f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781" Apr 24 19:23:46.058640 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.058527 2564 scope.go:117] "RemoveContainer" containerID="39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29" Apr 24 19:23:46.065476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.065453 2564 scope.go:117] "RemoveContainer" containerID="9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144" Apr 24 19:23:46.076209 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.074795 2564 scope.go:117] "RemoveContainer" containerID="f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781" Apr 24 19:23:46.076310 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:23:46.076199 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781\": container with ID starting with f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781 not found: ID does not exist" containerID="f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781" Apr 24 19:23:46.076310 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.076232 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781"} err="failed to get container status \"f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781\": rpc error: code = NotFound desc = could not find container \"f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781\": container with ID starting with f5a6614caaa2e3bba695592a38d5837f7cb7ddb27e846b60a99a4a146c62f781 not found: ID does not exist" Apr 24 19:23:46.076310 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.076256 2564 scope.go:117] "RemoveContainer" containerID="39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29" Apr 24 19:23:46.076776 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:23:46.076752 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29\": container with ID starting with 39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29 not found: ID does not exist" containerID="39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29" Apr 24 19:23:46.076904 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.076781 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29"} err="failed to get container status \"39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29\": rpc error: code = NotFound desc = could not find container \"39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29\": container with ID starting with 39cd57f5565d1a769a37ef5698c71b4f1e568c1ee9f1c99aee545ff7eadfdd29 not found: ID does not exist" Apr 24 19:23:46.076904 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.076813 2564 scope.go:117] "RemoveContainer" containerID="9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144" Apr 24 19:23:46.077131 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:23:46.077112 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144\": container with ID starting with 9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144 not found: ID does not exist" containerID="9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144" Apr 24 19:23:46.077173 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:23:46.077135 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144"} err="failed to get container status \"9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144\": rpc error: code = NotFound desc = could not find container \"9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144\": container with ID starting with 9c58c76ac0b4309c9a3da385d5d9f241e7aab80866a7d180954a59241126b144 not found: ID does not exist" Apr 24 19:23:46.097761 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.097731 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"] Apr 24 19:23:46.112145 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.112123 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-gvps4"] Apr 24 19:23:46.141375 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.141348 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6xqlp\" (UniqueName: \"kubernetes.io/projected/6092c070-c1a5-4043-a9b1-8370a7de0fee-kube-api-access-6xqlp\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:23:46.141375 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.141372 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6092c070-c1a5-4043-a9b1-8370a7de0fee-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:23:46.141545 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.141391 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6092c070-c1a5-4043-a9b1-8370a7de0fee-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:23:46.141545 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:46.141402 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6092c070-c1a5-4043-a9b1-8370a7de0fee-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:23:47.055799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:47.055757 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerStarted","Data":"a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f"} Apr 24 19:23:47.056229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:47.055813 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerStarted","Data":"aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6"} Apr 24 19:23:47.056229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:47.056190 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:47.078634 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:47.078545 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podStartSLOduration=6.078529543 podStartE2EDuration="6.078529543s" podCreationTimestamp="2026-04-24 19:23:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:23:47.076996717 +0000 UTC m=+1020.401240902" watchObservedRunningTime="2026-04-24 19:23:47.078529543 +0000 UTC m=+1020.402773707" Apr 24 19:23:47.185449 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:47.185412 2564 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" path="/var/lib/kubelet/pods/6092c070-c1a5-4043-a9b1-8370a7de0fee/volumes" Apr 24 19:23:48.059437 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:48.059396 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:48.060648 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:48.060617 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:23:49.062403 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:49.062360 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:23:54.066869 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:54.066833 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:23:54.067424 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:23:54.067396 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:24:04.067393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:24:04.067349 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" 
podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:24:14.067855 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:24:14.067814 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:24:24.068194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:24:24.068150 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:24:34.067571 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:24:34.067521 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:24:44.067406 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:24:44.067317 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:24:54.067397 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:24:54.067353 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:25:04.067707 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:04.067676 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:25:11.930047 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:11.930005 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r"] Apr 24 19:25:11.930510 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:11.930367 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" containerID="cri-o://aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6" gracePeriod=30 Apr 24 19:25:11.930510 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:11.930428 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kube-rbac-proxy" containerID="cri-o://a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f" gracePeriod=30 Apr 24 19:25:12.036977 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.036939 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"] Apr 24 19:25:12.037308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037269 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kube-rbac-proxy" Apr 24 19:25:12.037308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037288 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kube-rbac-proxy" Apr 24 
19:25:12.037308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037299 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" Apr 24 19:25:12.037308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037306 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" Apr 24 19:25:12.037472 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037333 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="storage-initializer" Apr 24 19:25:12.037472 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037342 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="storage-initializer" Apr 24 19:25:12.037472 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037401 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kube-rbac-proxy" Apr 24 19:25:12.037472 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.037412 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6092c070-c1a5-4043-a9b1-8370a7de0fee" containerName="kserve-container" Apr 24 19:25:12.040426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.040403 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.042769 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.042745 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 24 19:25:12.043202 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.043182 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 19:25:12.049378 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.049347 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"] Apr 24 19:25:12.111850 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.111814 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4x44\" (UniqueName: \"kubernetes.io/projected/160233ed-3114-4e24-9fae-a0ca826d8676-kube-api-access-l4x44\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.112050 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.111860 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160233ed-3114-4e24-9fae-a0ca826d8676-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.112050 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.111932 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/160233ed-3114-4e24-9fae-a0ca826d8676-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.112050 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.111979 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/160233ed-3114-4e24-9fae-a0ca826d8676-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.213121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.213080 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4x44\" (UniqueName: \"kubernetes.io/projected/160233ed-3114-4e24-9fae-a0ca826d8676-kube-api-access-l4x44\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.213350 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.213133 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160233ed-3114-4e24-9fae-a0ca826d8676-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.213350 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.213169 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/160233ed-3114-4e24-9fae-a0ca826d8676-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.213350 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.213189 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/160233ed-3114-4e24-9fae-a0ca826d8676-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.213641 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.213617 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160233ed-3114-4e24-9fae-a0ca826d8676-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.213945 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.213917 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/160233ed-3114-4e24-9fae-a0ca826d8676-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.215810 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.215792 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/160233ed-3114-4e24-9fae-a0ca826d8676-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.221037 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.221003 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4x44\" (UniqueName: \"kubernetes.io/projected/160233ed-3114-4e24-9fae-a0ca826d8676-kube-api-access-l4x44\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.285009 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.284966 2564 generic.go:358] "Generic (PLEG): container finished" podID="05101861-d839-47ed-a635-7210ff57c7a4" containerID="a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f" exitCode=2 Apr 24 19:25:12.285175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.285024 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerDied","Data":"a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f"} Apr 24 19:25:12.352176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.352144 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" Apr 24 19:25:12.478493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.478446 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"] Apr 24 19:25:12.481214 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:25:12.481179 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod160233ed_3114_4e24_9fae_a0ca826d8676.slice/crio-41614bce035d7064c5e3972dd98f887a92091230592fee64a942cf863ddaec5f WatchSource:0}: Error finding container 41614bce035d7064c5e3972dd98f887a92091230592fee64a942cf863ddaec5f: Status 404 returned error can't find the container with id 41614bce035d7064c5e3972dd98f887a92091230592fee64a942cf863ddaec5f Apr 24 19:25:12.483493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:12.483469 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:25:13.288677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:13.288639 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerStarted","Data":"856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a"} Apr 24 19:25:13.288677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:13.288678 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerStarted","Data":"41614bce035d7064c5e3972dd98f887a92091230592fee64a942cf863ddaec5f"} Apr 24 19:25:14.063320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:14.063274 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" 
podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.26:8643/healthz\": dial tcp 10.132.0.26:8643: connect: connection refused" Apr 24 19:25:14.067668 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:14.067639 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 24 19:25:16.581701 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.581673 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" Apr 24 19:25:16.646159 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.646124 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05101861-d839-47ed-a635-7210ff57c7a4-kserve-provision-location\") pod \"05101861-d839-47ed-a635-7210ff57c7a4\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " Apr 24 19:25:16.646159 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.646158 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05101861-d839-47ed-a635-7210ff57c7a4-proxy-tls\") pod \"05101861-d839-47ed-a635-7210ff57c7a4\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " Apr 24 19:25:16.646360 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.646195 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jv8gz\" (UniqueName: \"kubernetes.io/projected/05101861-d839-47ed-a635-7210ff57c7a4-kube-api-access-jv8gz\") pod \"05101861-d839-47ed-a635-7210ff57c7a4\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " Apr 24 19:25:16.646360 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.646227 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/05101861-d839-47ed-a635-7210ff57c7a4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"05101861-d839-47ed-a635-7210ff57c7a4\" (UID: \"05101861-d839-47ed-a635-7210ff57c7a4\") " Apr 24 19:25:16.646503 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.646481 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05101861-d839-47ed-a635-7210ff57c7a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "05101861-d839-47ed-a635-7210ff57c7a4" (UID: "05101861-d839-47ed-a635-7210ff57c7a4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:25:16.646618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.646595 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05101861-d839-47ed-a635-7210ff57c7a4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "05101861-d839-47ed-a635-7210ff57c7a4" (UID: "05101861-d839-47ed-a635-7210ff57c7a4"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:25:16.648305 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.648277 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05101861-d839-47ed-a635-7210ff57c7a4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "05101861-d839-47ed-a635-7210ff57c7a4" (UID: "05101861-d839-47ed-a635-7210ff57c7a4"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:25:16.648305 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.648279 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05101861-d839-47ed-a635-7210ff57c7a4-kube-api-access-jv8gz" (OuterVolumeSpecName: "kube-api-access-jv8gz") pod "05101861-d839-47ed-a635-7210ff57c7a4" (UID: "05101861-d839-47ed-a635-7210ff57c7a4"). InnerVolumeSpecName "kube-api-access-jv8gz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:25:16.746930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.746890 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/05101861-d839-47ed-a635-7210ff57c7a4-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:25:16.746930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.746923 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/05101861-d839-47ed-a635-7210ff57c7a4-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:25:16.746930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.746933 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jv8gz\" (UniqueName: \"kubernetes.io/projected/05101861-d839-47ed-a635-7210ff57c7a4-kube-api-access-jv8gz\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:25:16.747248 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:16.746943 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/05101861-d839-47ed-a635-7210ff57c7a4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:25:17.300816 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.300771 2564 generic.go:358] "Generic (PLEG): 
container finished" podID="160233ed-3114-4e24-9fae-a0ca826d8676" containerID="856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a" exitCode=0 Apr 24 19:25:17.301020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.300843 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerDied","Data":"856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a"} Apr 24 19:25:17.302601 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.302574 2564 generic.go:358] "Generic (PLEG): container finished" podID="05101861-d839-47ed-a635-7210ff57c7a4" containerID="aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6" exitCode=0 Apr 24 19:25:17.302727 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.302665 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerDied","Data":"aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6"} Apr 24 19:25:17.302727 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.302690 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r"
Apr 24 19:25:17.302727 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.302707 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r" event={"ID":"05101861-d839-47ed-a635-7210ff57c7a4","Type":"ContainerDied","Data":"66df70730bf21366233d1d9d01ad79322c87d317913ceec171183608da999e77"}
Apr 24 19:25:17.302727 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.302725 2564 scope.go:117] "RemoveContainer" containerID="a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f"
Apr 24 19:25:17.312140 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.312121 2564 scope.go:117] "RemoveContainer" containerID="aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6"
Apr 24 19:25:17.319162 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.319142 2564 scope.go:117] "RemoveContainer" containerID="d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63"
Apr 24 19:25:17.326445 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.326427 2564 scope.go:117] "RemoveContainer" containerID="a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f"
Apr 24 19:25:17.326733 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:25:17.326714 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f\": container with ID starting with a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f not found: ID does not exist" containerID="a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f"
Apr 24 19:25:17.326789 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.326742 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f"} err="failed to get container status \"a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f\": rpc error: code = NotFound desc = could not find container \"a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f\": container with ID starting with a033c4a3fd8cfc2b080d054013d30bd637c608051fe27be4e916ac4af5fcb82f not found: ID does not exist"
Apr 24 19:25:17.326789 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.326763 2564 scope.go:117] "RemoveContainer" containerID="aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6"
Apr 24 19:25:17.327015 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:25:17.326993 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6\": container with ID starting with aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6 not found: ID does not exist" containerID="aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6"
Apr 24 19:25:17.327065 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.327021 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6"} err="failed to get container status \"aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6\": rpc error: code = NotFound desc = could not find container \"aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6\": container with ID starting with aeb7aadf0e96e63493d2964a5a3a02032c212a7528e83e2b5b641ac353f94aa6 not found: ID does not exist"
Apr 24 19:25:17.327065 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.327037 2564 scope.go:117] "RemoveContainer" containerID="d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63"
Apr 24 19:25:17.327269 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:25:17.327251 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63\": container with ID starting with d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63 not found: ID does not exist" containerID="d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63"
Apr 24 19:25:17.327321 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.327277 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63"} err="failed to get container status \"d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63\": rpc error: code = NotFound desc = could not find container \"d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63\": container with ID starting with d16ba28f3296c59f761ee4d4f75b5aba17d28303fe8f7abf1fa72c7e6dda9d63 not found: ID does not exist"
Apr 24 19:25:17.334252 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.334225 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r"]
Apr 24 19:25:17.339463 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:17.339439 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-65m2r"]
Apr 24 19:25:19.186453 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:25:19.186416 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05101861-d839-47ed-a635-7210ff57c7a4" path="/var/lib/kubelet/pods/05101861-d839-47ed-a635-7210ff57c7a4/volumes"
Apr 24 19:27:30.713632 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:27:30.713590 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerStarted","Data":"d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45"}
Apr 24 19:27:30.713632 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:27:30.713635 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerStarted","Data":"6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7"}
Apr 24 19:27:30.714137 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:27:30.713726 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"
Apr 24 19:27:30.740584 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:27:30.740506 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" podStartSLOduration=6.159103406 podStartE2EDuration="2m18.740487021s" podCreationTimestamp="2026-04-24 19:25:12 +0000 UTC" firstStartedPulling="2026-04-24 19:25:17.302081963 +0000 UTC m=+1110.626326106" lastFinishedPulling="2026-04-24 19:27:29.883465577 +0000 UTC m=+1243.207709721" observedRunningTime="2026-04-24 19:27:30.73859515 +0000 UTC m=+1244.062839313" watchObservedRunningTime="2026-04-24 19:27:30.740487021 +0000 UTC m=+1244.064731191"
Apr 24 19:27:31.717863 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:27:31.717833 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"
Apr 24 19:27:37.727001 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:27:37.726922 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"
Apr 24 19:28:07.730617 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:07.730585 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"
Apr 24 19:28:12.173440 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.173405 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"]
Apr 24 19:28:12.173952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.173737 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kserve-container" containerID="cri-o://6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7" gracePeriod=30
Apr 24 19:28:12.173952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.173787 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kube-rbac-proxy" containerID="cri-o://d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45" gracePeriod=30
Apr 24 19:28:12.292416 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292380 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"]
Apr 24 19:28:12.292828 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292805 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container"
Apr 24 19:28:12.292876 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292831 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container"
Apr 24 19:28:12.292876 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292852 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kube-rbac-proxy"
Apr 24 19:28:12.292876 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292859 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kube-rbac-proxy"
Apr 24 19:28:12.292876 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292869 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="storage-initializer"
Apr 24 19:28:12.293000 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292877 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="storage-initializer"
Apr 24 19:28:12.293000 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292956 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kserve-container"
Apr 24 19:28:12.293000 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.292966 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="05101861-d839-47ed-a635-7210ff57c7a4" containerName="kube-rbac-proxy"
Apr 24 19:28:12.296313 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.296293 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.298660 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.298639 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 24 19:28:12.298660 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.298650 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\""
Apr 24 19:28:12.305879 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.305847 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"]
Apr 24 19:28:12.409027 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.408985 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.409229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.409043 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb17873b-6731-405e-b780-475217d47fa8-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.409229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.409073 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvlbg\" (UniqueName: \"kubernetes.io/projected/fb17873b-6731-405e-b780-475217d47fa8-kube-api-access-lvlbg\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.409229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.409104 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb17873b-6731-405e-b780-475217d47fa8-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.510061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.510022 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb17873b-6731-405e-b780-475217d47fa8-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.510061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.510066 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lvlbg\" (UniqueName: \"kubernetes.io/projected/fb17873b-6731-405e-b780-475217d47fa8-kube-api-access-lvlbg\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.510344 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.510100 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb17873b-6731-405e-b780-475217d47fa8-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.510344 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.510179 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.510344 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:12.510267 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found
Apr 24 19:28:12.510344 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:12.510325 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls podName:fb17873b-6731-405e-b780-475217d47fa8 nodeName:}" failed. No retries permitted until 2026-04-24 19:28:13.010305274 +0000 UTC m=+1286.334549428 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" (UID: "fb17873b-6731-405e-b780-475217d47fa8") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found
Apr 24 19:28:12.510575 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.510483 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb17873b-6731-405e-b780-475217d47fa8-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.511000 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.510974 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb17873b-6731-405e-b780-475217d47fa8-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.519787 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.519755 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvlbg\" (UniqueName: \"kubernetes.io/projected/fb17873b-6731-405e-b780-475217d47fa8-kube-api-access-lvlbg\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:12.723071 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.723029 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.27:8643/healthz\": dial tcp 10.132.0.27:8643: connect: connection refused"
Apr 24 19:28:12.837800 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.837712 2564 generic.go:358] "Generic (PLEG): container finished" podID="160233ed-3114-4e24-9fae-a0ca826d8676" containerID="d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45" exitCode=2
Apr 24 19:28:12.837948 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:12.837788 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerDied","Data":"d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45"}
Apr 24 19:28:13.016278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.016240 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:13.018741 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.018710 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:13.206770 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.206737 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:13.325063 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.325040 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"
Apr 24 19:28:13.336385 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.336359 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"]
Apr 24 19:28:13.339089 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:28:13.339059 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb17873b_6731_405e_b780_475217d47fa8.slice/crio-223ab468f898397756b4b53eab4f8f8cdcd4afa90c7d938090e29efb66a35e0e WatchSource:0}: Error finding container 223ab468f898397756b4b53eab4f8f8cdcd4afa90c7d938090e29efb66a35e0e: Status 404 returned error can't find the container with id 223ab468f898397756b4b53eab4f8f8cdcd4afa90c7d938090e29efb66a35e0e
Apr 24 19:28:13.420614 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.420584 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/160233ed-3114-4e24-9fae-a0ca826d8676-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"160233ed-3114-4e24-9fae-a0ca826d8676\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") "
Apr 24 19:28:13.420793 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.420633 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/160233ed-3114-4e24-9fae-a0ca826d8676-proxy-tls\") pod \"160233ed-3114-4e24-9fae-a0ca826d8676\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") "
Apr 24 19:28:13.420793 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.420669 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160233ed-3114-4e24-9fae-a0ca826d8676-kserve-provision-location\") pod \"160233ed-3114-4e24-9fae-a0ca826d8676\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") "
Apr 24 19:28:13.420793 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.420718 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4x44\" (UniqueName: \"kubernetes.io/projected/160233ed-3114-4e24-9fae-a0ca826d8676-kube-api-access-l4x44\") pod \"160233ed-3114-4e24-9fae-a0ca826d8676\" (UID: \"160233ed-3114-4e24-9fae-a0ca826d8676\") "
Apr 24 19:28:13.421014 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.420985 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/160233ed-3114-4e24-9fae-a0ca826d8676-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "160233ed-3114-4e24-9fae-a0ca826d8676" (UID: "160233ed-3114-4e24-9fae-a0ca826d8676"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:28:13.421112 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.421087 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/160233ed-3114-4e24-9fae-a0ca826d8676-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "160233ed-3114-4e24-9fae-a0ca826d8676" (UID: "160233ed-3114-4e24-9fae-a0ca826d8676"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:28:13.422728 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.422696 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/160233ed-3114-4e24-9fae-a0ca826d8676-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "160233ed-3114-4e24-9fae-a0ca826d8676" (UID: "160233ed-3114-4e24-9fae-a0ca826d8676"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:28:13.422834 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.422806 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/160233ed-3114-4e24-9fae-a0ca826d8676-kube-api-access-l4x44" (OuterVolumeSpecName: "kube-api-access-l4x44") pod "160233ed-3114-4e24-9fae-a0ca826d8676" (UID: "160233ed-3114-4e24-9fae-a0ca826d8676"). InnerVolumeSpecName "kube-api-access-l4x44". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:28:13.521422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.521388 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l4x44\" (UniqueName: \"kubernetes.io/projected/160233ed-3114-4e24-9fae-a0ca826d8676-kube-api-access-l4x44\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:28:13.521422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.521418 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/160233ed-3114-4e24-9fae-a0ca826d8676-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:28:13.521422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.521430 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/160233ed-3114-4e24-9fae-a0ca826d8676-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:28:13.521659 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.521440 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/160233ed-3114-4e24-9fae-a0ca826d8676-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:28:13.847688 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.847653 2564 generic.go:358] "Generic (PLEG): container finished" podID="160233ed-3114-4e24-9fae-a0ca826d8676" containerID="6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7" exitCode=0
Apr 24 19:28:13.847868 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.847720 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerDied","Data":"6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7"}
Apr 24 19:28:13.847868 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.847757 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7" event={"ID":"160233ed-3114-4e24-9fae-a0ca826d8676","Type":"ContainerDied","Data":"41614bce035d7064c5e3972dd98f887a92091230592fee64a942cf863ddaec5f"}
Apr 24 19:28:13.847868 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.847775 2564 scope.go:117] "RemoveContainer" containerID="d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45"
Apr 24 19:28:13.847868 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.847786 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"
Apr 24 19:28:13.849200 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.849174 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerStarted","Data":"744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa"}
Apr 24 19:28:13.849293 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.849208 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerStarted","Data":"223ab468f898397756b4b53eab4f8f8cdcd4afa90c7d938090e29efb66a35e0e"}
Apr 24 19:28:13.855460 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.855441 2564 scope.go:117] "RemoveContainer" containerID="6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7"
Apr 24 19:28:13.862776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.862759 2564 scope.go:117] "RemoveContainer" containerID="856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a"
Apr 24 19:28:13.870162 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.870144 2564 scope.go:117] "RemoveContainer" containerID="d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45"
Apr 24 19:28:13.870491 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:13.870463 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45\": container with ID starting with d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45 not found: ID does not exist" containerID="d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45"
Apr 24 19:28:13.870546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.870506 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45"} err="failed to get container status \"d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45\": rpc error: code = NotFound desc = could not find container \"d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45\": container with ID starting with d646e4c28cb5a545c69b0bbfd5537223dc09bb92096d4833f6cd898bb160ab45 not found: ID does not exist"
Apr 24 19:28:13.870546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.870533 2564 scope.go:117] "RemoveContainer" containerID="6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7"
Apr 24 19:28:13.870876 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:13.870855 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7\": container with ID starting with 6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7 not found: ID does not exist" containerID="6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7"
Apr 24 19:28:13.870957 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.870883 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7"} err="failed to get container status \"6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7\": rpc error: code = NotFound desc = could not find container \"6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7\": container with ID starting with 6a42ca793503e554c58be844fe910f37105c12e3780536c35c0ed77494e30ea7 not found: ID does not exist"
Apr 24 19:28:13.870957 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.870901 2564 scope.go:117] "RemoveContainer" containerID="856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a"
Apr 24 19:28:13.871190 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:13.871165 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a\": container with ID starting with 856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a not found: ID does not exist" containerID="856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a"
Apr 24 19:28:13.871254 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.871196 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a"} err="failed to get container status \"856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a\": rpc error: code = NotFound desc = could not find container \"856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a\": container with ID starting with 856bdde2b90650b4b4030512c4147d138661ede25118d5d3def317870e661c3a not found: ID does not exist"
Apr 24 19:28:13.882768 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.882735 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"]
Apr 24 19:28:13.885115 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:13.885090 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-dfch7"]
Apr 24 19:28:15.184294 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:15.184261 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" path="/var/lib/kubelet/pods/160233ed-3114-4e24-9fae-a0ca826d8676/volumes"
Apr 24 19:28:17.863594 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:17.863543 2564 generic.go:358] "Generic (PLEG): container finished" podID="fb17873b-6731-405e-b780-475217d47fa8" containerID="744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa" exitCode=0
Apr 24 19:28:17.863969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:17.863618 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerDied","Data":"744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa"}
Apr 24 19:28:18.867934 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:18.867899 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerStarted","Data":"212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee"}
Apr 24 19:28:18.868310 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:18.867941 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerStarted","Data":"78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7"}
Apr 24 19:28:18.868310 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:18.868144 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:18.887663 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:18.887615 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" podStartSLOduration=6.887600162 podStartE2EDuration="6.887600162s" podCreationTimestamp="2026-04-24 19:28:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:28:18.885833935 +0000 UTC m=+1292.210078099" watchObservedRunningTime="2026-04-24 19:28:18.887600162 +0000 UTC m=+1292.211844324"
Apr 24 19:28:19.870723 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:19.870689 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:19.871932 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:19.871896 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 24 19:28:20.873453 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:20.873408 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 24 19:28:25.878322 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:25.878293 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:25.879390 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:25.879372 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"
Apr 24 19:28:32.323772 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.323736 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"]
Apr 24 19:28:32.324256 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.324192 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kserve-container" containerID="cri-o://78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7" gracePeriod=30
Apr 24 19:28:32.324358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.324317 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kube-rbac-proxy" containerID="cri-o://212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee" gracePeriod=30
Apr 24 19:28:32.394889 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.394859 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"]
Apr 24 19:28:32.395169 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395158 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kserve-container"
Apr 24 19:28:32.395227 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395172 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kserve-container"
Apr 24 19:28:32.395227 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395189 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kube-rbac-proxy"
Apr 24 19:28:32.395227 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395195 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kube-rbac-proxy"
Apr 24 19:28:32.395227 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395202 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="160233ed-3114-4e24-9fae-a0ca826d8676"
containerName="storage-initializer" Apr 24 19:28:32.395227 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395209 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="storage-initializer" Apr 24 19:28:32.395372 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395250 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kserve-container" Apr 24 19:28:32.395372 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.395259 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="160233ed-3114-4e24-9fae-a0ca826d8676" containerName="kube-rbac-proxy" Apr 24 19:28:32.398102 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.398082 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.400451 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.400430 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 24 19:28:32.400590 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.400448 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 24 19:28:32.408046 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.408024 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"] Apr 24 19:28:32.444791 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.444760 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81d7891b-13a2-43d5-816b-945df0c6bf51-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod 
\"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.444903 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.444839 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81d7891b-13a2-43d5-816b-945df0c6bf51-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.444903 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.444883 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldffz\" (UniqueName: \"kubernetes.io/projected/81d7891b-13a2-43d5-816b-945df0c6bf51-kube-api-access-ldffz\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.444984 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.444921 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81d7891b-13a2-43d5-816b-945df0c6bf51-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.545948 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.545908 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81d7891b-13a2-43d5-816b-945df0c6bf51-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: 
\"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.546121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.545956 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldffz\" (UniqueName: \"kubernetes.io/projected/81d7891b-13a2-43d5-816b-945df0c6bf51-kube-api-access-ldffz\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.546121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.545989 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81d7891b-13a2-43d5-816b-945df0c6bf51-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.546121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.546025 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81d7891b-13a2-43d5-816b-945df0c6bf51-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.546508 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.546476 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81d7891b-13a2-43d5-816b-945df0c6bf51-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.546782 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.546763 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81d7891b-13a2-43d5-816b-945df0c6bf51-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.548539 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.548513 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81d7891b-13a2-43d5-816b-945df0c6bf51-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.554755 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.554727 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldffz\" (UniqueName: \"kubernetes.io/projected/81d7891b-13a2-43d5-816b-945df0c6bf51-kube-api-access-ldffz\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.708536 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.708486 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:32.835086 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.835062 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"] Apr 24 19:28:32.837752 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:28:32.837727 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81d7891b_13a2_43d5_816b_945df0c6bf51.slice/crio-f6edb5496a345e1e2bd4d0b486c950b0beb753a8f844ef99a33980febb2c82e4 WatchSource:0}: Error finding container f6edb5496a345e1e2bd4d0b486c950b0beb753a8f844ef99a33980febb2c82e4: Status 404 returned error can't find the container with id f6edb5496a345e1e2bd4d0b486c950b0beb753a8f844ef99a33980febb2c82e4 Apr 24 19:28:32.909974 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.909946 2564 generic.go:358] "Generic (PLEG): container finished" podID="fb17873b-6731-405e-b780-475217d47fa8" containerID="212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee" exitCode=2 Apr 24 19:28:32.910094 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.910022 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerDied","Data":"212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee"} Apr 24 19:28:32.911353 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.911326 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerStarted","Data":"7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7"} Apr 24 19:28:32.911460 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:32.911362 2564 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerStarted","Data":"f6edb5496a345e1e2bd4d0b486c950b0beb753a8f844ef99a33980febb2c82e4"} Apr 24 19:28:33.068690 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.068665 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" Apr 24 19:28:33.151448 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.151416 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls\") pod \"fb17873b-6731-405e-b780-475217d47fa8\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " Apr 24 19:28:33.151663 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.151480 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb17873b-6731-405e-b780-475217d47fa8-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"fb17873b-6731-405e-b780-475217d47fa8\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " Apr 24 19:28:33.151663 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.151516 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvlbg\" (UniqueName: \"kubernetes.io/projected/fb17873b-6731-405e-b780-475217d47fa8-kube-api-access-lvlbg\") pod \"fb17873b-6731-405e-b780-475217d47fa8\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " Apr 24 19:28:33.151663 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.151536 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb17873b-6731-405e-b780-475217d47fa8-kserve-provision-location\") pod 
\"fb17873b-6731-405e-b780-475217d47fa8\" (UID: \"fb17873b-6731-405e-b780-475217d47fa8\") " Apr 24 19:28:33.151951 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.151915 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb17873b-6731-405e-b780-475217d47fa8-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "fb17873b-6731-405e-b780-475217d47fa8" (UID: "fb17873b-6731-405e-b780-475217d47fa8"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:28:33.152047 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.151966 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb17873b-6731-405e-b780-475217d47fa8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb17873b-6731-405e-b780-475217d47fa8" (UID: "fb17873b-6731-405e-b780-475217d47fa8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:28:33.153712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.153686 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fb17873b-6731-405e-b780-475217d47fa8" (UID: "fb17873b-6731-405e-b780-475217d47fa8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:28:33.153712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.153696 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb17873b-6731-405e-b780-475217d47fa8-kube-api-access-lvlbg" (OuterVolumeSpecName: "kube-api-access-lvlbg") pod "fb17873b-6731-405e-b780-475217d47fa8" (UID: "fb17873b-6731-405e-b780-475217d47fa8"). 
InnerVolumeSpecName "kube-api-access-lvlbg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:28:33.252337 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.252261 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fb17873b-6731-405e-b780-475217d47fa8-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:28:33.252337 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.252288 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fb17873b-6731-405e-b780-475217d47fa8-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:28:33.252337 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.252299 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lvlbg\" (UniqueName: \"kubernetes.io/projected/fb17873b-6731-405e-b780-475217d47fa8-kube-api-access-lvlbg\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:28:33.252337 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.252309 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb17873b-6731-405e-b780-475217d47fa8-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:28:33.915765 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.915724 2564 generic.go:358] "Generic (PLEG): container finished" podID="fb17873b-6731-405e-b780-475217d47fa8" containerID="78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7" exitCode=0 Apr 24 19:28:33.916210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.915776 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" 
event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerDied","Data":"78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7"} Apr 24 19:28:33.916210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.915823 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" Apr 24 19:28:33.916210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.915841 2564 scope.go:117] "RemoveContainer" containerID="212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee" Apr 24 19:28:33.916210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.915827 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv" event={"ID":"fb17873b-6731-405e-b780-475217d47fa8","Type":"ContainerDied","Data":"223ab468f898397756b4b53eab4f8f8cdcd4afa90c7d938090e29efb66a35e0e"} Apr 24 19:28:33.923479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.923460 2564 scope.go:117] "RemoveContainer" containerID="78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7" Apr 24 19:28:33.930063 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.930041 2564 scope.go:117] "RemoveContainer" containerID="744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa" Apr 24 19:28:33.935322 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.935300 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"] Apr 24 19:28:33.937435 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.937412 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-wwzvv"] Apr 24 19:28:33.937987 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.937968 2564 scope.go:117] "RemoveContainer" containerID="212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee" Apr 24 19:28:33.938252 
ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:33.938235 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee\": container with ID starting with 212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee not found: ID does not exist" containerID="212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee" Apr 24 19:28:33.938297 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.938267 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee"} err="failed to get container status \"212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee\": rpc error: code = NotFound desc = could not find container \"212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee\": container with ID starting with 212d757bb593d6b07027f8b9c7b1d4f7e2dd639d883de0ad37ce351428065fee not found: ID does not exist" Apr 24 19:28:33.938297 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.938285 2564 scope.go:117] "RemoveContainer" containerID="78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7" Apr 24 19:28:33.938513 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:33.938495 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7\": container with ID starting with 78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7 not found: ID does not exist" containerID="78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7" Apr 24 19:28:33.938565 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.938520 2564 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7"} err="failed to get container status \"78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7\": rpc error: code = NotFound desc = could not find container \"78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7\": container with ID starting with 78c57aad375c0a03ecc370c186c44b8a7463cdf5e987eb3464ea32b3b6df76c7 not found: ID does not exist" Apr 24 19:28:33.938565 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.938537 2564 scope.go:117] "RemoveContainer" containerID="744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa" Apr 24 19:28:33.938785 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:28:33.938767 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa\": container with ID starting with 744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa not found: ID does not exist" containerID="744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa" Apr 24 19:28:33.938839 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:33.938793 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa"} err="failed to get container status \"744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa\": rpc error: code = NotFound desc = could not find container \"744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa\": container with ID starting with 744d01ddb7555d150d202f8fea1174051d7e067f49706beafd7c9d4246caeaaa not found: ID does not exist" Apr 24 19:28:35.183878 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:35.183845 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb17873b-6731-405e-b780-475217d47fa8" 
path="/var/lib/kubelet/pods/fb17873b-6731-405e-b780-475217d47fa8/volumes" Apr 24 19:28:37.930714 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:37.930678 2564 generic.go:358] "Generic (PLEG): container finished" podID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerID="7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7" exitCode=0 Apr 24 19:28:37.931091 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:37.930755 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerDied","Data":"7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7"} Apr 24 19:28:38.936212 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:38.936177 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerStarted","Data":"219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259"} Apr 24 19:28:38.936212 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:38.936218 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerStarted","Data":"17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad"} Apr 24 19:28:38.936662 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:38.936419 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:38.956133 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:38.956083 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" podStartSLOduration=6.956069875 podStartE2EDuration="6.956069875s" 
podCreationTimestamp="2026-04-24 19:28:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:28:38.954301749 +0000 UTC m=+1312.278545937" watchObservedRunningTime="2026-04-24 19:28:38.956069875 +0000 UTC m=+1312.280314038" Apr 24 19:28:39.939324 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:39.939289 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:28:45.947529 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:28:45.947497 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:29:15.951020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:15.950947 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" Apr 24 19:29:22.490414 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.490384 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"] Apr 24 19:29:22.491186 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.490742 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kserve-container" containerID="cri-o://17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad" gracePeriod=30 Apr 24 19:29:22.491186 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.490799 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kube-rbac-proxy" 
containerID="cri-o://219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259" gracePeriod=30
Apr 24 19:29:22.609606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.609573 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"]
Apr 24 19:29:22.610031 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610007 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kserve-container"
Apr 24 19:29:22.610031 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610029 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kserve-container"
Apr 24 19:29:22.610194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610045 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="storage-initializer"
Apr 24 19:29:22.610194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610053 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="storage-initializer"
Apr 24 19:29:22.610194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610088 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kube-rbac-proxy"
Apr 24 19:29:22.610194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610097 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kube-rbac-proxy"
Apr 24 19:29:22.610194 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610186 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kube-rbac-proxy"
Apr 24 19:29:22.610362 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.610212 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb17873b-6731-405e-b780-475217d47fa8" containerName="kserve-container"
Apr 24 19:29:22.613858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.613838 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.616306 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.616285 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\""
Apr 24 19:29:22.616421 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.616347 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\""
Apr 24 19:29:22.623929 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.623905 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"]
Apr 24 19:29:22.730912 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.730855 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3b80bb1-368d-4a41-a91d-216101e84e6c-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.730912 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.730924 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tk45\" (UniqueName: \"kubernetes.io/projected/a3b80bb1-368d-4a41-a91d-216101e84e6c-kube-api-access-5tk45\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.731142 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.730980 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3b80bb1-368d-4a41-a91d-216101e84e6c-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.731142 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.731001 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3b80bb1-368d-4a41-a91d-216101e84e6c-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.832304 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.832210 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3b80bb1-368d-4a41-a91d-216101e84e6c-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.832304 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.832258 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3b80bb1-368d-4a41-a91d-216101e84e6c-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.832304 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.832297 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3b80bb1-368d-4a41-a91d-216101e84e6c-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.832604 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.832323 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5tk45\" (UniqueName: \"kubernetes.io/projected/a3b80bb1-368d-4a41-a91d-216101e84e6c-kube-api-access-5tk45\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.832826 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.832802 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3b80bb1-368d-4a41-a91d-216101e84e6c-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.833153 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.833127 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3b80bb1-368d-4a41-a91d-216101e84e6c-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.834845 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.834824 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3b80bb1-368d-4a41-a91d-216101e84e6c-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.840728 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.840704 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tk45\" (UniqueName: \"kubernetes.io/projected/a3b80bb1-368d-4a41-a91d-216101e84e6c-kube-api-access-5tk45\") pod \"isvc-sklearn-mcp-predictor-b9d4994b5-7dg42\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:22.924715 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:22.924670 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:23.052518 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.052485 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"]
Apr 24 19:29:23.054964 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:29:23.054933 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3b80bb1_368d_4a41_a91d_216101e84e6c.slice/crio-3ca3b9613ca7a47988b0301813fb5830ed0a26ba9cfd0592e1006c1e11cf60fe WatchSource:0}: Error finding container 3ca3b9613ca7a47988b0301813fb5830ed0a26ba9cfd0592e1006c1e11cf60fe: Status 404 returned error can't find the container with id 3ca3b9613ca7a47988b0301813fb5830ed0a26ba9cfd0592e1006c1e11cf60fe
Apr 24 19:29:23.059171 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.059148 2564 generic.go:358] "Generic (PLEG): container finished" podID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerID="219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259" exitCode=2
Apr 24 19:29:23.059256 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.059216 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerDied","Data":"219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259"}
Apr 24 19:29:23.826158 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.826136 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"
Apr 24 19:29:23.942909 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.942813 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81d7891b-13a2-43d5-816b-945df0c6bf51-kserve-provision-location\") pod \"81d7891b-13a2-43d5-816b-945df0c6bf51\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") "
Apr 24 19:29:23.942909 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.942887 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81d7891b-13a2-43d5-816b-945df0c6bf51-proxy-tls\") pod \"81d7891b-13a2-43d5-816b-945df0c6bf51\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") "
Apr 24 19:29:23.943137 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.942914 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldffz\" (UniqueName: \"kubernetes.io/projected/81d7891b-13a2-43d5-816b-945df0c6bf51-kube-api-access-ldffz\") pod \"81d7891b-13a2-43d5-816b-945df0c6bf51\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") "
Apr 24 19:29:23.943137 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.942959 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81d7891b-13a2-43d5-816b-945df0c6bf51-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"81d7891b-13a2-43d5-816b-945df0c6bf51\" (UID: \"81d7891b-13a2-43d5-816b-945df0c6bf51\") "
Apr 24 19:29:23.943365 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.943326 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81d7891b-13a2-43d5-816b-945df0c6bf51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "81d7891b-13a2-43d5-816b-945df0c6bf51" (UID: "81d7891b-13a2-43d5-816b-945df0c6bf51"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:29:23.943365 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.943353 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d7891b-13a2-43d5-816b-945df0c6bf51-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "81d7891b-13a2-43d5-816b-945df0c6bf51" (UID: "81d7891b-13a2-43d5-816b-945df0c6bf51"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:29:23.945060 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.945039 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d7891b-13a2-43d5-816b-945df0c6bf51-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "81d7891b-13a2-43d5-816b-945df0c6bf51" (UID: "81d7891b-13a2-43d5-816b-945df0c6bf51"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:29:23.945060 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:23.945035 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d7891b-13a2-43d5-816b-945df0c6bf51-kube-api-access-ldffz" (OuterVolumeSpecName: "kube-api-access-ldffz") pod "81d7891b-13a2-43d5-816b-945df0c6bf51" (UID: "81d7891b-13a2-43d5-816b-945df0c6bf51"). InnerVolumeSpecName "kube-api-access-ldffz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:29:24.044080 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.044047 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/81d7891b-13a2-43d5-816b-945df0c6bf51-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:29:24.044080 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.044076 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81d7891b-13a2-43d5-816b-945df0c6bf51-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:29:24.044080 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.044086 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/81d7891b-13a2-43d5-816b-945df0c6bf51-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:29:24.044306 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.044097 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldffz\" (UniqueName: \"kubernetes.io/projected/81d7891b-13a2-43d5-816b-945df0c6bf51-kube-api-access-ldffz\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:29:24.063856 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.063822 2564 generic.go:358] "Generic (PLEG): container finished" podID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerID="17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad" exitCode=0
Apr 24 19:29:24.064039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.063894 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerDied","Data":"17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad"}
Apr 24 19:29:24.064039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.063921 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q" event={"ID":"81d7891b-13a2-43d5-816b-945df0c6bf51","Type":"ContainerDied","Data":"f6edb5496a345e1e2bd4d0b486c950b0beb753a8f844ef99a33980febb2c82e4"}
Apr 24 19:29:24.064039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.063922 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"
Apr 24 19:29:24.064039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.063939 2564 scope.go:117] "RemoveContainer" containerID="219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259"
Apr 24 19:29:24.065218 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.065188 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerStarted","Data":"b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5"}
Apr 24 19:29:24.065323 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.065233 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerStarted","Data":"3ca3b9613ca7a47988b0301813fb5830ed0a26ba9cfd0592e1006c1e11cf60fe"}
Apr 24 19:29:24.072177 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.072160 2564 scope.go:117] "RemoveContainer" containerID="17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad"
Apr 24 19:29:24.079247 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.079226 2564 scope.go:117] "RemoveContainer" containerID="7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7"
Apr 24 19:29:24.085707 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.085689 2564 scope.go:117] "RemoveContainer" containerID="219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259"
Apr 24 19:29:24.085957 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:29:24.085936 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259\": container with ID starting with 219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259 not found: ID does not exist" containerID="219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259"
Apr 24 19:29:24.086009 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.085966 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259"} err="failed to get container status \"219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259\": rpc error: code = NotFound desc = could not find container \"219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259\": container with ID starting with 219e83c8f35547f79fde054786a0dbab3226d866a64c11f240e6a29c9d76d259 not found: ID does not exist"
Apr 24 19:29:24.086009 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.085984 2564 scope.go:117] "RemoveContainer" containerID="17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad"
Apr 24 19:29:24.086213 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:29:24.086195 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad\": container with ID starting with 17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad not found: ID does not exist" containerID="17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad"
Apr 24 19:29:24.086276 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.086223 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad"} err="failed to get container status \"17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad\": rpc error: code = NotFound desc = could not find container \"17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad\": container with ID starting with 17d36a6aa43c64c816f1d7d2bc50a13f258be27b9df459b6bf5f7932397cbbad not found: ID does not exist"
Apr 24 19:29:24.086276 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.086247 2564 scope.go:117] "RemoveContainer" containerID="7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7"
Apr 24 19:29:24.086515 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:29:24.086497 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7\": container with ID starting with 7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7 not found: ID does not exist" containerID="7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7"
Apr 24 19:29:24.086573 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.086520 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7"} err="failed to get container status \"7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7\": rpc error: code = NotFound desc = could not find container \"7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7\": container with ID starting with 7d495f153e337af9884838aee2e5cfcc9ee2357766f6989c44eb9470480d81c7 not found: ID does not exist"
Apr 24 19:29:24.108582 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.108538 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"]
Apr 24 19:29:24.116475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:24.116445 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-bgz7q"]
Apr 24 19:29:25.184842 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:25.184808 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" path="/var/lib/kubelet/pods/81d7891b-13a2-43d5-816b-945df0c6bf51/volumes"
Apr 24 19:29:27.075537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:27.075457 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerID="b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5" exitCode=0
Apr 24 19:29:27.075893 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:27.075529 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerDied","Data":"b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5"}
Apr 24 19:29:28.080025 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:28.079990 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerStarted","Data":"9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647"}
Apr 24 19:29:31.090243 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:31.090204 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerStarted","Data":"6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8"}
Apr 24 19:29:31.090243 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:31.090243 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerStarted","Data":"3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3"}
Apr 24 19:29:31.090703 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:31.090418 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:31.112138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:31.112086 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podStartSLOduration=6.069487502 podStartE2EDuration="9.112073425s" podCreationTimestamp="2026-04-24 19:29:22 +0000 UTC" firstStartedPulling="2026-04-24 19:29:27.132491375 +0000 UTC m=+1360.456735520" lastFinishedPulling="2026-04-24 19:29:30.175077298 +0000 UTC m=+1363.499321443" observedRunningTime="2026-04-24 19:29:31.110878596 +0000 UTC m=+1364.435122758" watchObservedRunningTime="2026-04-24 19:29:31.112073425 +0000 UTC m=+1364.436317588"
Apr 24 19:29:32.093677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:32.093644 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:32.093677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:32.093682 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:38.102187 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:38.102158 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:29:58.102995 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:29:58.102953 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 24 19:30:08.104352 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:08.104318 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:30:38.105566 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:38.105477 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"
Apr 24 19:30:42.649127 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.649093 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"]
Apr 24 19:30:42.649628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.649444 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" containerID="cri-o://9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647" gracePeriod=30
Apr 24 19:30:42.649628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.649494 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" containerID="cri-o://6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8" gracePeriod=30
Apr 24 19:30:42.649628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.649595 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-agent" containerID="cri-o://3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3" gracePeriod=30
Apr 24 19:30:42.730717 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.730675 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"]
Apr 24 19:30:42.731439 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731406 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="storage-initializer"
Apr 24 19:30:42.731439 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731438 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="storage-initializer"
Apr 24 19:30:42.731667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731452 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kserve-container"
Apr 24 19:30:42.731667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731462 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kserve-container"
Apr 24 19:30:42.731667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731496 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kube-rbac-proxy"
Apr 24 19:30:42.731667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731507 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kube-rbac-proxy"
Apr 24 19:30:42.731858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731759 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kserve-container"
Apr 24 19:30:42.731858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.731775 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="81d7891b-13a2-43d5-816b-945df0c6bf51" containerName="kube-rbac-proxy"
Apr 24 19:30:42.737511 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.737483 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.740109 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.740084 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\""
Apr 24 19:30:42.740223 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.740084 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\""
Apr 24 19:30:42.746968 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.746944 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"]
Apr 24 19:30:42.850478 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.850438 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.850478 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.850482 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/869cebca-b23e-4b03-907c-0feb685b856f-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.850705 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.850505 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869cebca-b23e-4b03-907c-0feb685b856f-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.850705 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.850524 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw95s\" (UniqueName: \"kubernetes.io/projected/869cebca-b23e-4b03-907c-0feb685b856f-kube-api-access-jw95s\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.951177 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.951144 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869cebca-b23e-4b03-907c-0feb685b856f-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.951340 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.951185 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jw95s\" (UniqueName: \"kubernetes.io/projected/869cebca-b23e-4b03-907c-0feb685b856f-kube-api-access-jw95s\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.951340 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.951246 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.951340 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.951270 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/869cebca-b23e-4b03-907c-0feb685b856f-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.951491 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:30:42.951383 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-predictor-serving-cert: secret "isvc-paddle-predictor-serving-cert" not found
Apr 24 19:30:42.951491 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:30:42.951473 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls podName:869cebca-b23e-4b03-907c-0feb685b856f nodeName:}" failed. No retries permitted until 2026-04-24 19:30:43.451454362 +0000 UTC m=+1436.775698517 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls") pod "isvc-paddle-predictor-6b8b7cfb4b-h7nw6" (UID: "869cebca-b23e-4b03-907c-0feb685b856f") : secret "isvc-paddle-predictor-serving-cert" not found
Apr 24 19:30:42.951649 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.951629 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869cebca-b23e-4b03-907c-0feb685b856f-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.951920 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.951904 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/869cebca-b23e-4b03-907c-0feb685b856f-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:42.959932 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:42.959909 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw95s\" (UniqueName: \"kubernetes.io/projected/869cebca-b23e-4b03-907c-0feb685b856f-kube-api-access-jw95s\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:43.097861 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.097812 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused"
Apr 24 19:30:43.291399 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.291314 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerID="6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8" exitCode=2
Apr 24 19:30:43.291536 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.291390 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerDied","Data":"6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8"}
Apr 24 19:30:43.455375 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.455334 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:43.457912 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.457888 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-h7nw6\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:43.648626 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.648497 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"
Apr 24 19:30:43.776697 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.776666 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"]
Apr 24 19:30:43.779851 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:30:43.779823 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869cebca_b23e_4b03_907c_0feb685b856f.slice/crio-2f54e9d24d3ca949f786f73356a507a60122bc54dbbe9d57b07a04499dfa4750 WatchSource:0}: Error finding container 2f54e9d24d3ca949f786f73356a507a60122bc54dbbe9d57b07a04499dfa4750: Status 404 returned error can't find the container with id 2f54e9d24d3ca949f786f73356a507a60122bc54dbbe9d57b07a04499dfa4750
Apr 24 19:30:43.781730 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:43.781712 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:30:44.295391 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:44.295356 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerStarted","Data":"5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881"}
Apr 24 19:30:44.295391 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:44.295399 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerStarted","Data":"2f54e9d24d3ca949f786f73356a507a60122bc54dbbe9d57b07a04499dfa4750"}
Apr 24 19:30:45.300616 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:45.300576 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3b80bb1-368d-4a41-a91d-216101e84e6c"
containerID="9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647" exitCode=0 Apr 24 19:30:45.300987 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:45.300645 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerDied","Data":"9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647"} Apr 24 19:30:48.097425 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:48.097374 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 24 19:30:48.103441 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:48.103405 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 19:30:48.310130 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:48.310095 2564 generic.go:358] "Generic (PLEG): container finished" podID="869cebca-b23e-4b03-907c-0feb685b856f" containerID="5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881" exitCode=0 Apr 24 19:30:48.310295 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:48.310136 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerDied","Data":"5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881"} Apr 24 19:30:53.097655 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:53.097597 
2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 24 19:30:53.098104 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:53.097769 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" Apr 24 19:30:58.096992 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:58.096936 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 24 19:30:58.103568 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:30:58.103518 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 19:31:00.350735 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:00.350698 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerStarted","Data":"8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823"} Apr 24 19:31:00.350735 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:00.350743 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" 
event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerStarted","Data":"99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7"} Apr 24 19:31:00.351206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:00.350948 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" Apr 24 19:31:00.369347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:00.369296 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podStartSLOduration=6.795504425 podStartE2EDuration="18.369277818s" podCreationTimestamp="2026-04-24 19:30:42 +0000 UTC" firstStartedPulling="2026-04-24 19:30:48.311329109 +0000 UTC m=+1441.635573251" lastFinishedPulling="2026-04-24 19:30:59.885102495 +0000 UTC m=+1453.209346644" observedRunningTime="2026-04-24 19:31:00.368464951 +0000 UTC m=+1453.692709133" watchObservedRunningTime="2026-04-24 19:31:00.369277818 +0000 UTC m=+1453.693521984" Apr 24 19:31:01.353291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:01.353253 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" Apr 24 19:31:01.354402 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:01.354373 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 24 19:31:02.356071 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:02.356034 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection 
refused" Apr 24 19:31:03.097799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:03.097754 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 24 19:31:07.360979 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:07.360946 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" Apr 24 19:31:07.361568 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:07.361521 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 24 19:31:08.097335 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:08.097287 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 24 19:31:08.103443 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:08.103409 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 24 19:31:08.103543 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:08.103526 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" Apr 24 19:31:12.859065 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.859040 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" Apr 24 19:31:12.888539 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.888511 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3b80bb1-368d-4a41-a91d-216101e84e6c-proxy-tls\") pod \"a3b80bb1-368d-4a41-a91d-216101e84e6c\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " Apr 24 19:31:12.888726 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.888591 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3b80bb1-368d-4a41-a91d-216101e84e6c-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"a3b80bb1-368d-4a41-a91d-216101e84e6c\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " Apr 24 19:31:12.888726 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.888613 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tk45\" (UniqueName: \"kubernetes.io/projected/a3b80bb1-368d-4a41-a91d-216101e84e6c-kube-api-access-5tk45\") pod \"a3b80bb1-368d-4a41-a91d-216101e84e6c\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " Apr 24 19:31:12.888726 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.888647 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3b80bb1-368d-4a41-a91d-216101e84e6c-kserve-provision-location\") pod \"a3b80bb1-368d-4a41-a91d-216101e84e6c\" (UID: \"a3b80bb1-368d-4a41-a91d-216101e84e6c\") " Apr 24 19:31:12.888992 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.888958 2564 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3b80bb1-368d-4a41-a91d-216101e84e6c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3b80bb1-368d-4a41-a91d-216101e84e6c" (UID: "a3b80bb1-368d-4a41-a91d-216101e84e6c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:31:12.889039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.888984 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3b80bb1-368d-4a41-a91d-216101e84e6c-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "a3b80bb1-368d-4a41-a91d-216101e84e6c" (UID: "a3b80bb1-368d-4a41-a91d-216101e84e6c"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:31:12.890800 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.890774 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3b80bb1-368d-4a41-a91d-216101e84e6c-kube-api-access-5tk45" (OuterVolumeSpecName: "kube-api-access-5tk45") pod "a3b80bb1-368d-4a41-a91d-216101e84e6c" (UID: "a3b80bb1-368d-4a41-a91d-216101e84e6c"). InnerVolumeSpecName "kube-api-access-5tk45". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:31:12.890907 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.890838 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a3b80bb1-368d-4a41-a91d-216101e84e6c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a3b80bb1-368d-4a41-a91d-216101e84e6c" (UID: "a3b80bb1-368d-4a41-a91d-216101e84e6c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:31:12.989819 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.989781 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a3b80bb1-368d-4a41-a91d-216101e84e6c-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:12.989819 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.989813 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a3b80bb1-368d-4a41-a91d-216101e84e6c-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:12.989819 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.989824 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5tk45\" (UniqueName: \"kubernetes.io/projected/a3b80bb1-368d-4a41-a91d-216101e84e6c-kube-api-access-5tk45\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:12.990064 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:12.989834 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3b80bb1-368d-4a41-a91d-216101e84e6c-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:13.389366 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.389271 2564 generic.go:358] "Generic (PLEG): container finished" podID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerID="3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3" exitCode=137 Apr 24 19:31:13.389506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.389367 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" 
event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerDied","Data":"3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3"} Apr 24 19:31:13.389506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.389372 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" Apr 24 19:31:13.389506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.389398 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42" event={"ID":"a3b80bb1-368d-4a41-a91d-216101e84e6c","Type":"ContainerDied","Data":"3ca3b9613ca7a47988b0301813fb5830ed0a26ba9cfd0592e1006c1e11cf60fe"} Apr 24 19:31:13.389506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.389418 2564 scope.go:117] "RemoveContainer" containerID="6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8" Apr 24 19:31:13.397238 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.397216 2564 scope.go:117] "RemoveContainer" containerID="3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3" Apr 24 19:31:13.404382 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.404362 2564 scope.go:117] "RemoveContainer" containerID="9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647" Apr 24 19:31:13.408024 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.407987 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"] Apr 24 19:31:13.411990 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.411967 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-b9d4994b5-7dg42"] Apr 24 19:31:13.412184 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.412167 2564 scope.go:117] "RemoveContainer" containerID="b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5" Apr 24 19:31:13.418787 ip-10-0-129-124 kubenswrapper[2564]: 
I0424 19:31:13.418757 2564 scope.go:117] "RemoveContainer" containerID="6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8" Apr 24 19:31:13.419032 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:13.419013 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8\": container with ID starting with 6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8 not found: ID does not exist" containerID="6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8" Apr 24 19:31:13.419090 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.419040 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8"} err="failed to get container status \"6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8\": rpc error: code = NotFound desc = could not find container \"6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8\": container with ID starting with 6cec8dc11315510e0115355ab2d528a1d35f42c7f17b5aa66feec74de19de4d8 not found: ID does not exist" Apr 24 19:31:13.419090 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.419062 2564 scope.go:117] "RemoveContainer" containerID="3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3" Apr 24 19:31:13.419280 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:13.419261 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3\": container with ID starting with 3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3 not found: ID does not exist" containerID="3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3" Apr 24 19:31:13.419342 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.419283 
2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3"} err="failed to get container status \"3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3\": rpc error: code = NotFound desc = could not find container \"3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3\": container with ID starting with 3da98f7d942a40a089cd0c8cd8b6dd36e3aced9d43b899ba6b2636e903b2a4d3 not found: ID does not exist" Apr 24 19:31:13.419342 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.419298 2564 scope.go:117] "RemoveContainer" containerID="9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647" Apr 24 19:31:13.419508 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:13.419488 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647\": container with ID starting with 9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647 not found: ID does not exist" containerID="9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647" Apr 24 19:31:13.419594 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.419514 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647"} err="failed to get container status \"9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647\": rpc error: code = NotFound desc = could not find container \"9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647\": container with ID starting with 9e67c9471c59a6677aebf638075c8f3ee4b8ec5fea791b8fb54b3550a6108647 not found: ID does not exist" Apr 24 19:31:13.419594 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.419530 2564 scope.go:117] "RemoveContainer" 
containerID="b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5" Apr 24 19:31:13.419845 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:13.419829 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5\": container with ID starting with b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5 not found: ID does not exist" containerID="b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5" Apr 24 19:31:13.419904 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:13.419848 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5"} err="failed to get container status \"b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5\": rpc error: code = NotFound desc = could not find container \"b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5\": container with ID starting with b13c5ad2530b7ebb0f197cfc53166418ded47d25bbfed3a511bb582875b024c5 not found: ID does not exist" Apr 24 19:31:15.183686 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:15.183653 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" path="/var/lib/kubelet/pods/a3b80bb1-368d-4a41-a91d-216101e84e6c/volumes" Apr 24 19:31:17.361599 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:17.361540 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 24 19:31:27.361497 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:27.361456 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 24 19:31:37.362269 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:37.362224 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 24 19:31:47.362704 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:47.362676 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" Apr 24 19:31:54.282120 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.282076 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"] Apr 24 19:31:54.282645 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.282511 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" containerID="cri-o://99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7" gracePeriod=30 Apr 24 19:31:54.282645 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.282524 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kube-rbac-proxy" containerID="cri-o://8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823" gracePeriod=30 Apr 24 19:31:54.467949 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.467913 2564 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5"] Apr 24 19:31:54.468197 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468185 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" Apr 24 19:31:54.468241 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468197 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" Apr 24 19:31:54.468241 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468216 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" Apr 24 19:31:54.468241 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468223 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" Apr 24 19:31:54.468241 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468231 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="storage-initializer" Apr 24 19:31:54.468241 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468237 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="storage-initializer" Apr 24 19:31:54.468241 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468243 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-agent" Apr 24 19:31:54.468416 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468249 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-agent" Apr 24 19:31:54.468416 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468298 2564 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-agent" Apr 24 19:31:54.468416 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468304 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kserve-container" Apr 24 19:31:54.468416 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.468311 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3b80bb1-368d-4a41-a91d-216101e84e6c" containerName="kube-rbac-proxy" Apr 24 19:31:54.471187 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.471171 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.479026 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.478997 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 24 19:31:54.479672 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.479652 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 24 19:31:54.499164 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.499134 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5"] Apr 24 19:31:54.506979 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.506950 2564 generic.go:358] "Generic (PLEG): container finished" podID="869cebca-b23e-4b03-907c-0feb685b856f" containerID="8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823" exitCode=2 Apr 24 19:31:54.507107 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.506991 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" 
event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerDied","Data":"8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823"} Apr 24 19:31:54.620618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.620503 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bd825e5-e415-49a5-84a6-c4180dcdd073-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.620785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.620641 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd825e5-e415-49a5-84a6-c4180dcdd073-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.620785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.620708 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.620785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.620738 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbkb5\" (UniqueName: \"kubernetes.io/projected/8bd825e5-e415-49a5-84a6-c4180dcdd073-kube-api-access-rbkb5\") pod 
\"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.721533 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.721495 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bd825e5-e415-49a5-84a6-c4180dcdd073-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.721699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.721587 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd825e5-e415-49a5-84a6-c4180dcdd073-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.721699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.721626 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.721699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.721661 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rbkb5\" (UniqueName: \"kubernetes.io/projected/8bd825e5-e415-49a5-84a6-c4180dcdd073-kube-api-access-rbkb5\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.721854 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:54.721812 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-paddle-runtime-predictor-serving-cert: secret "isvc-paddle-runtime-predictor-serving-cert" not found Apr 24 19:31:54.721902 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:54.721891 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls podName:8bd825e5-e415-49a5-84a6-c4180dcdd073 nodeName:}" failed. No retries permitted until 2026-04-24 19:31:55.221868887 +0000 UTC m=+1508.546113038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls") pod "isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" (UID: "8bd825e5-e415-49a5-84a6-c4180dcdd073") : secret "isvc-paddle-runtime-predictor-serving-cert" not found Apr 24 19:31:54.722028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.722011 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd825e5-e415-49a5-84a6-c4180dcdd073-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:54.722306 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.722289 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bd825e5-e415-49a5-84a6-c4180dcdd073-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 
19:31:54.740422 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:54.740388 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbkb5\" (UniqueName: \"kubernetes.io/projected/8bd825e5-e415-49a5-84a6-c4180dcdd073-kube-api-access-rbkb5\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:55.225649 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:55.225610 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:55.227956 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:55.227922 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:55.380385 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:55.380329 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:31:55.510803 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:55.510729 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5"] Apr 24 19:31:55.513638 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:31:55.513606 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8bd825e5_e415_49a5_84a6_c4180dcdd073.slice/crio-4a02f453238cb48e0660e35f5a457e5bd9ef7b447c58ee48be78e2e41cdeff98 WatchSource:0}: Error finding container 4a02f453238cb48e0660e35f5a457e5bd9ef7b447c58ee48be78e2e41cdeff98: Status 404 returned error can't find the container with id 4a02f453238cb48e0660e35f5a457e5bd9ef7b447c58ee48be78e2e41cdeff98 Apr 24 19:31:56.514062 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:56.514029 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerStarted","Data":"b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62"} Apr 24 19:31:56.514062 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:56.514063 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerStarted","Data":"4a02f453238cb48e0660e35f5a457e5bd9ef7b447c58ee48be78e2e41cdeff98"} Apr 24 19:31:57.122592 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.122567 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" Apr 24 19:31:57.244289 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.244258 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls\") pod \"869cebca-b23e-4b03-907c-0feb685b856f\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " Apr 24 19:31:57.244289 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.244306 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869cebca-b23e-4b03-907c-0feb685b856f-kserve-provision-location\") pod \"869cebca-b23e-4b03-907c-0feb685b856f\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " Apr 24 19:31:57.244541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.244352 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw95s\" (UniqueName: \"kubernetes.io/projected/869cebca-b23e-4b03-907c-0feb685b856f-kube-api-access-jw95s\") pod \"869cebca-b23e-4b03-907c-0feb685b856f\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " Apr 24 19:31:57.244541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.244406 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/869cebca-b23e-4b03-907c-0feb685b856f-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"869cebca-b23e-4b03-907c-0feb685b856f\" (UID: \"869cebca-b23e-4b03-907c-0feb685b856f\") " Apr 24 19:31:57.244806 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.244775 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869cebca-b23e-4b03-907c-0feb685b856f-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod 
"869cebca-b23e-4b03-907c-0feb685b856f" (UID: "869cebca-b23e-4b03-907c-0feb685b856f"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:31:57.246414 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.246383 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "869cebca-b23e-4b03-907c-0feb685b856f" (UID: "869cebca-b23e-4b03-907c-0feb685b856f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:31:57.246414 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.246401 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869cebca-b23e-4b03-907c-0feb685b856f-kube-api-access-jw95s" (OuterVolumeSpecName: "kube-api-access-jw95s") pod "869cebca-b23e-4b03-907c-0feb685b856f" (UID: "869cebca-b23e-4b03-907c-0feb685b856f"). InnerVolumeSpecName "kube-api-access-jw95s". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:31:57.253854 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.253831 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/869cebca-b23e-4b03-907c-0feb685b856f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "869cebca-b23e-4b03-907c-0feb685b856f" (UID: "869cebca-b23e-4b03-907c-0feb685b856f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:31:57.345574 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.345515 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/869cebca-b23e-4b03-907c-0feb685b856f-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:57.345574 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.345570 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/869cebca-b23e-4b03-907c-0feb685b856f-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:57.345774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.345585 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jw95s\" (UniqueName: \"kubernetes.io/projected/869cebca-b23e-4b03-907c-0feb685b856f-kube-api-access-jw95s\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:57.345774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.345600 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/869cebca-b23e-4b03-907c-0feb685b856f-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:31:57.518635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.518533 2564 generic.go:358] "Generic (PLEG): container finished" podID="869cebca-b23e-4b03-907c-0feb685b856f" containerID="99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7" exitCode=0 Apr 24 19:31:57.518635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.518623 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" Apr 24 19:31:57.519110 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.518614 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerDied","Data":"99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7"} Apr 24 19:31:57.519110 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.518727 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6" event={"ID":"869cebca-b23e-4b03-907c-0feb685b856f","Type":"ContainerDied","Data":"2f54e9d24d3ca949f786f73356a507a60122bc54dbbe9d57b07a04499dfa4750"} Apr 24 19:31:57.519110 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.518743 2564 scope.go:117] "RemoveContainer" containerID="8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823" Apr 24 19:31:57.529684 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.529663 2564 scope.go:117] "RemoveContainer" containerID="99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7" Apr 24 19:31:57.536824 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.536797 2564 scope.go:117] "RemoveContainer" containerID="5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881" Apr 24 19:31:57.542305 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.542257 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"] Apr 24 19:31:57.543831 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.543809 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-h7nw6"] Apr 24 19:31:57.544921 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.544898 2564 scope.go:117] "RemoveContainer" containerID="8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823" Apr 24 
19:31:57.545294 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:57.545267 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823\": container with ID starting with 8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823 not found: ID does not exist" containerID="8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823" Apr 24 19:31:57.545345 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.545304 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823"} err="failed to get container status \"8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823\": rpc error: code = NotFound desc = could not find container \"8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823\": container with ID starting with 8414567ab5f74011b79937899cafc1f7a41033463112785b36467fab4c952823 not found: ID does not exist" Apr 24 19:31:57.545345 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.545324 2564 scope.go:117] "RemoveContainer" containerID="99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7" Apr 24 19:31:57.545693 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:57.545670 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7\": container with ID starting with 99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7 not found: ID does not exist" containerID="99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7" Apr 24 19:31:57.545750 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.545702 2564 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7"} err="failed to get container status \"99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7\": rpc error: code = NotFound desc = could not find container \"99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7\": container with ID starting with 99486d1eb5acbb16b9223724c24bbd6cc8b5dd3608ca372c3ee93b25769487b7 not found: ID does not exist" Apr 24 19:31:57.545750 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.545726 2564 scope.go:117] "RemoveContainer" containerID="5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881" Apr 24 19:31:57.545969 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:31:57.545952 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881\": container with ID starting with 5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881 not found: ID does not exist" containerID="5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881" Apr 24 19:31:57.546020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:57.545973 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881"} err="failed to get container status \"5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881\": rpc error: code = NotFound desc = could not find container \"5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881\": container with ID starting with 5969b06be51af2a5749b04124323672e34ef80de1b23309d76d9fc34a883a881 not found: ID does not exist" Apr 24 19:31:59.184635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:31:59.184600 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869cebca-b23e-4b03-907c-0feb685b856f" 
path="/var/lib/kubelet/pods/869cebca-b23e-4b03-907c-0feb685b856f/volumes" Apr 24 19:32:00.528117 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:00.528080 2564 generic.go:358] "Generic (PLEG): container finished" podID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerID="b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62" exitCode=0 Apr 24 19:32:00.528500 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:00.528128 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerDied","Data":"b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62"} Apr 24 19:32:01.533774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:01.533742 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerStarted","Data":"afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470"} Apr 24 19:32:01.533774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:01.533782 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerStarted","Data":"46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e"} Apr 24 19:32:01.534243 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:01.534079 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:32:01.534243 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:01.534212 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:32:01.535546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:01.535515 2564 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 19:32:01.553694 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:01.553640 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podStartSLOduration=7.553625854 podStartE2EDuration="7.553625854s" podCreationTimestamp="2026-04-24 19:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:32:01.552222456 +0000 UTC m=+1514.876466619" watchObservedRunningTime="2026-04-24 19:32:01.553625854 +0000 UTC m=+1514.877870016" Apr 24 19:32:02.536755 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:02.536716 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 19:32:07.541300 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:07.541226 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:32:07.541815 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:07.541789 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 19:32:17.542475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:17.542434 2564 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 19:32:27.542161 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:27.542123 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 19:32:37.542601 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:37.542543 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 19:32:47.542768 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:47.542678 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:32:55.673650 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.673613 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5"] Apr 24 19:32:55.674134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.673955 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" containerID="cri-o://46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e" gracePeriod=30 Apr 24 19:32:55.674134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.674007 2564 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kube-rbac-proxy" containerID="cri-o://afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470" gracePeriod=30 Apr 24 19:32:55.763002 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.762973 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"] Apr 24 19:32:55.763313 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763298 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="storage-initializer" Apr 24 19:32:55.763394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763316 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="storage-initializer" Apr 24 19:32:55.763394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763337 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" Apr 24 19:32:55.763394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763346 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" Apr 24 19:32:55.763394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763356 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kube-rbac-proxy" Apr 24 19:32:55.763394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763365 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kube-rbac-proxy" Apr 24 19:32:55.763666 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763459 2564 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kserve-container" Apr 24 19:32:55.763666 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.763473 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="869cebca-b23e-4b03-907c-0feb685b856f" containerName="kube-rbac-proxy" Apr 24 19:32:55.767846 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.767823 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.770387 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.770361 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 19:32:55.770387 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.770383 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\"" Apr 24 19:32:55.775785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.775761 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"] Apr 24 19:32:55.891966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.891933 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ba4c06f-efab-4839-a284-228cc474b54f-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.892134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.891982 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c06f-efab-4839-a284-228cc474b54f-kserve-provision-location\") pod 
\"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.892134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.892059 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ba4c06f-efab-4839-a284-228cc474b54f-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.892134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.892088 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjkng\" (UniqueName: \"kubernetes.io/projected/5ba4c06f-efab-4839-a284-228cc474b54f-kube-api-access-sjkng\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.992962 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.992924 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ba4c06f-efab-4839-a284-228cc474b54f-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.993168 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.993018 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c06f-efab-4839-a284-228cc474b54f-kserve-provision-location\") pod 
\"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.993168 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.993065 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ba4c06f-efab-4839-a284-228cc474b54f-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.993168 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.993089 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sjkng\" (UniqueName: \"kubernetes.io/projected/5ba4c06f-efab-4839-a284-228cc474b54f-kube-api-access-sjkng\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.993433 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.993409 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c06f-efab-4839-a284-228cc474b54f-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.993776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.993755 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ba4c06f-efab-4839-a284-228cc474b54f-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod 
\"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:55.995291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:55.995273 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ba4c06f-efab-4839-a284-228cc474b54f-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:56.001121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:56.001095 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjkng\" (UniqueName: \"kubernetes.io/projected/5ba4c06f-efab-4839-a284-228cc474b54f-kube-api-access-sjkng\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:56.079334 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:56.079293 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:32:56.198472 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:56.198447 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"] Apr 24 19:32:56.200516 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:32:56.200487 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba4c06f_efab_4839_a284_228cc474b54f.slice/crio-fdf95fccbdc3318ebf77d69d1cf906322aaf29530d4157654c7a9caec1e6bf5c WatchSource:0}: Error finding container fdf95fccbdc3318ebf77d69d1cf906322aaf29530d4157654c7a9caec1e6bf5c: Status 404 returned error can't find the container with id fdf95fccbdc3318ebf77d69d1cf906322aaf29530d4157654c7a9caec1e6bf5c Apr 24 19:32:56.685917 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:56.685614 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerStarted","Data":"06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6"} Apr 24 19:32:56.685917 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:56.685654 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerStarted","Data":"fdf95fccbdc3318ebf77d69d1cf906322aaf29530d4157654c7a9caec1e6bf5c"} Apr 24 19:32:56.688210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:56.688183 2564 generic.go:358] "Generic (PLEG): container finished" podID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerID="afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470" exitCode=2 Apr 24 19:32:56.688341 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:56.688242 2564 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerDied","Data":"afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470"} Apr 24 19:32:57.537574 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:57.537512 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused" Apr 24 19:32:57.541831 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:57.541805 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused" Apr 24 19:32:58.419409 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.419384 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:32:58.516224 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.516125 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd825e5-e415-49a5-84a6-c4180dcdd073-kserve-provision-location\") pod \"8bd825e5-e415-49a5-84a6-c4180dcdd073\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " Apr 24 19:32:58.516224 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.516171 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls\") pod \"8bd825e5-e415-49a5-84a6-c4180dcdd073\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " Apr 24 19:32:58.516434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.516296 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbkb5\" (UniqueName: \"kubernetes.io/projected/8bd825e5-e415-49a5-84a6-c4180dcdd073-kube-api-access-rbkb5\") pod \"8bd825e5-e415-49a5-84a6-c4180dcdd073\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " Apr 24 19:32:58.516434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.516391 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bd825e5-e415-49a5-84a6-c4180dcdd073-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"8bd825e5-e415-49a5-84a6-c4180dcdd073\" (UID: \"8bd825e5-e415-49a5-84a6-c4180dcdd073\") " Apr 24 19:32:58.516833 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.516793 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bd825e5-e415-49a5-84a6-c4180dcdd073-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "8bd825e5-e415-49a5-84a6-c4180dcdd073" (UID: "8bd825e5-e415-49a5-84a6-c4180dcdd073"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:32:58.518418 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.518396 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bd825e5-e415-49a5-84a6-c4180dcdd073-kube-api-access-rbkb5" (OuterVolumeSpecName: "kube-api-access-rbkb5") pod "8bd825e5-e415-49a5-84a6-c4180dcdd073" (UID: "8bd825e5-e415-49a5-84a6-c4180dcdd073"). InnerVolumeSpecName "kube-api-access-rbkb5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:32:58.518516 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.518421 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8bd825e5-e415-49a5-84a6-c4180dcdd073" (UID: "8bd825e5-e415-49a5-84a6-c4180dcdd073"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:32:58.529508 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.529481 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bd825e5-e415-49a5-84a6-c4180dcdd073-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8bd825e5-e415-49a5-84a6-c4180dcdd073" (UID: "8bd825e5-e415-49a5-84a6-c4180dcdd073"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:32:58.617363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.617322 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8bd825e5-e415-49a5-84a6-c4180dcdd073-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:32:58.617363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.617354 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8bd825e5-e415-49a5-84a6-c4180dcdd073-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:32:58.617363 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.617364 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8bd825e5-e415-49a5-84a6-c4180dcdd073-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:32:58.617639 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.617376 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rbkb5\" (UniqueName: \"kubernetes.io/projected/8bd825e5-e415-49a5-84a6-c4180dcdd073-kube-api-access-rbkb5\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:32:58.695082 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.695044 2564 generic.go:358] "Generic (PLEG): container finished" podID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerID="46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e" exitCode=0 Apr 24 19:32:58.695230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.695098 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" 
event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerDied","Data":"46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e"} Apr 24 19:32:58.695230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.695125 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" event={"ID":"8bd825e5-e415-49a5-84a6-c4180dcdd073","Type":"ContainerDied","Data":"4a02f453238cb48e0660e35f5a457e5bd9ef7b447c58ee48be78e2e41cdeff98"} Apr 24 19:32:58.695230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.695122 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5" Apr 24 19:32:58.695230 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.695135 2564 scope.go:117] "RemoveContainer" containerID="afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470" Apr 24 19:32:58.704196 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.704158 2564 scope.go:117] "RemoveContainer" containerID="46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e" Apr 24 19:32:58.712018 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.711999 2564 scope.go:117] "RemoveContainer" containerID="b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62" Apr 24 19:32:58.718596 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.718577 2564 scope.go:117] "RemoveContainer" containerID="afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470" Apr 24 19:32:58.718838 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:32:58.718818 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470\": container with ID starting with afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470 not found: ID does not exist" 
containerID="afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470" Apr 24 19:32:58.718901 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.718850 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470"} err="failed to get container status \"afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470\": rpc error: code = NotFound desc = could not find container \"afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470\": container with ID starting with afe80ce37f66620270c2b349ad5396697ff842fec20aa6691d24af919f482470 not found: ID does not exist" Apr 24 19:32:58.718901 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.718875 2564 scope.go:117] "RemoveContainer" containerID="46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e" Apr 24 19:32:58.719146 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:32:58.719122 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e\": container with ID starting with 46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e not found: ID does not exist" containerID="46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e" Apr 24 19:32:58.719261 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.719148 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e"} err="failed to get container status \"46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e\": rpc error: code = NotFound desc = could not find container \"46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e\": container with ID starting with 46d1d693879ca9bb277a52640425954c0ca17131e7b3bfcadd6a4c6b6014c83e not found: ID does not exist" Apr 24 
19:32:58.719261 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.719172 2564 scope.go:117] "RemoveContainer" containerID="b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62" Apr 24 19:32:58.719618 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:32:58.719440 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62\": container with ID starting with b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62 not found: ID does not exist" containerID="b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62" Apr 24 19:32:58.719618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.719480 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62"} err="failed to get container status \"b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62\": rpc error: code = NotFound desc = could not find container \"b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62\": container with ID starting with b5126b73b74302c3cef7860f805e9c4d0380def89caaf85bc2fe31c1d6a09d62 not found: ID does not exist" Apr 24 19:32:58.720832 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.720807 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5"] Apr 24 19:32:58.724483 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:58.724453 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-bd5d5"] Apr 24 19:32:59.184312 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:32:59.184279 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" path="/var/lib/kubelet/pods/8bd825e5-e415-49a5-84a6-c4180dcdd073/volumes" Apr 24 19:33:01.705389 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:01.705357 2564 generic.go:358] "Generic (PLEG): container finished" podID="5ba4c06f-efab-4839-a284-228cc474b54f" containerID="06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6" exitCode=0 Apr 24 19:33:01.705780 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:01.705409 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerDied","Data":"06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6"} Apr 24 19:33:02.709671 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:02.709634 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerStarted","Data":"3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54"} Apr 24 19:33:02.710158 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:02.709684 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerStarted","Data":"e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134"} Apr 24 19:33:02.710158 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:02.709982 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:33:02.710158 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:02.710012 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:33:02.711425 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:02.711399 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 24 19:33:02.729358 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:02.729301 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podStartSLOduration=7.7292871210000005 podStartE2EDuration="7.729287121s" podCreationTimestamp="2026-04-24 19:32:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:33:02.72763885 +0000 UTC m=+1576.051883024" watchObservedRunningTime="2026-04-24 19:33:02.729287121 +0000 UTC m=+1576.053531284" Apr 24 19:33:03.712701 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:03.712651 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 24 19:33:08.717616 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:08.717585 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:33:08.718259 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:08.718231 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 24 19:33:18.718945 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:18.718903 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 24 19:33:28.718631 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:28.718582 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 24 19:33:38.719015 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:38.718911 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused" Apr 24 19:33:48.719429 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:48.719399 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" Apr 24 19:33:57.385298 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.385251 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"] Apr 24 19:33:57.385802 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.385699 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" containerID="cri-o://e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134" gracePeriod=30 Apr 24 19:33:57.385802 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.385749 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kube-rbac-proxy" containerID="cri-o://3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54" gracePeriod=30 Apr 24 19:33:57.506205 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506175 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"] Apr 24 19:33:57.506464 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506452 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="storage-initializer" Apr 24 19:33:57.506519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506465 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="storage-initializer" Apr 24 19:33:57.506519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506479 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kube-rbac-proxy" Apr 24 19:33:57.506519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506484 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kube-rbac-proxy" Apr 24 19:33:57.506519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506500 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" Apr 24 19:33:57.506519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506506 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" Apr 24 19:33:57.506696 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506576 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" 
containerName="kube-rbac-proxy" Apr 24 19:33:57.506696 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.506588 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8bd825e5-e415-49a5-84a6-c4180dcdd073" containerName="kserve-container" Apr 24 19:33:57.509445 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.509424 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.511730 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.511699 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\"" Apr 24 19:33:57.511830 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.511699 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\"" Apr 24 19:33:57.519921 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.519896 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"] Apr 24 19:33:57.564306 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.564272 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbqh4\" (UniqueName: \"kubernetes.io/projected/1e20f801-56e0-4215-adbb-ee401b14af1e-kube-api-access-bbqh4\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.564306 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.564314 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e20f801-56e0-4215-adbb-ee401b14af1e-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.564524 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.564380 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e20f801-56e0-4215-adbb-ee401b14af1e-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.564524 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.564488 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.665618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.665504 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.665618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.665584 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbqh4\" (UniqueName: \"kubernetes.io/projected/1e20f801-56e0-4215-adbb-ee401b14af1e-kube-api-access-bbqh4\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.665618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.665608 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e20f801-56e0-4215-adbb-ee401b14af1e-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.665879 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.665644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e20f801-56e0-4215-adbb-ee401b14af1e-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:33:57.665879 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:33:57.665737 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-predictor-serving-cert: secret "isvc-pmml-predictor-serving-cert" not found Apr 24 19:33:57.665879 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:33:57.665807 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls podName:1e20f801-56e0-4215-adbb-ee401b14af1e nodeName:}" failed. No retries permitted until 2026-04-24 19:33:58.165784768 +0000 UTC m=+1631.490028911 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls") pod "isvc-pmml-predictor-8bb578669-slm29" (UID: "1e20f801-56e0-4215-adbb-ee401b14af1e") : secret "isvc-pmml-predictor-serving-cert" not found
Apr 24 19:33:57.666113 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.666090 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e20f801-56e0-4215-adbb-ee401b14af1e-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:33:57.666353 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.666338 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e20f801-56e0-4215-adbb-ee401b14af1e-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:33:57.675978 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.675945 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbqh4\" (UniqueName: \"kubernetes.io/projected/1e20f801-56e0-4215-adbb-ee401b14af1e-kube-api-access-bbqh4\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:33:57.871804 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.871767 2564 generic.go:358] "Generic (PLEG): container finished" podID="5ba4c06f-efab-4839-a284-228cc474b54f" containerID="3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54" exitCode=2
Apr 24 19:33:57.871985 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:57.871843 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerDied","Data":"3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54"}
Apr 24 19:33:58.170010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.169962 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:33:58.172450 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.172425 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-slm29\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:33:58.419789 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.419748 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:33:58.542056 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.542020 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"]
Apr 24 19:33:58.545414 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:33:58.545386 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e20f801_56e0_4215_adbb_ee401b14af1e.slice/crio-fc05d9b601c15f1752b3f826d9ec53251ce18c1de2a495fbfd4e9deec315d183 WatchSource:0}: Error finding container fc05d9b601c15f1752b3f826d9ec53251ce18c1de2a495fbfd4e9deec315d183: Status 404 returned error can't find the container with id fc05d9b601c15f1752b3f826d9ec53251ce18c1de2a495fbfd4e9deec315d183
Apr 24 19:33:58.713224 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.713186 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 24 19:33:58.718343 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.718301 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 24 19:33:58.875981 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.875947 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerStarted","Data":"b2b993f6d43a8f5231f1839e0c7d216c386bacb8522ba7b7172bf22b56724652"}
Apr 24 19:33:58.876175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:33:58.875990 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerStarted","Data":"fc05d9b601c15f1752b3f826d9ec53251ce18c1de2a495fbfd4e9deec315d183"}
Apr 24 19:34:00.329340 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.329315 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"
Apr 24 19:34:00.388516 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.388429 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjkng\" (UniqueName: \"kubernetes.io/projected/5ba4c06f-efab-4839-a284-228cc474b54f-kube-api-access-sjkng\") pod \"5ba4c06f-efab-4839-a284-228cc474b54f\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") "
Apr 24 19:34:00.388516 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.388489 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ba4c06f-efab-4839-a284-228cc474b54f-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"5ba4c06f-efab-4839-a284-228cc474b54f\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") "
Apr 24 19:34:00.388516 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.388518 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c06f-efab-4839-a284-228cc474b54f-kserve-provision-location\") pod \"5ba4c06f-efab-4839-a284-228cc474b54f\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") "
Apr 24 19:34:00.388875 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.388535 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ba4c06f-efab-4839-a284-228cc474b54f-proxy-tls\") pod \"5ba4c06f-efab-4839-a284-228cc474b54f\" (UID: \"5ba4c06f-efab-4839-a284-228cc474b54f\") "
Apr 24 19:34:00.388974 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.388925 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ba4c06f-efab-4839-a284-228cc474b54f-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "5ba4c06f-efab-4839-a284-228cc474b54f" (UID: "5ba4c06f-efab-4839-a284-228cc474b54f"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:34:00.390695 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.390660 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ba4c06f-efab-4839-a284-228cc474b54f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5ba4c06f-efab-4839-a284-228cc474b54f" (UID: "5ba4c06f-efab-4839-a284-228cc474b54f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:34:00.390800 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.390773 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ba4c06f-efab-4839-a284-228cc474b54f-kube-api-access-sjkng" (OuterVolumeSpecName: "kube-api-access-sjkng") pod "5ba4c06f-efab-4839-a284-228cc474b54f" (UID: "5ba4c06f-efab-4839-a284-228cc474b54f"). InnerVolumeSpecName "kube-api-access-sjkng". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:34:00.398874 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.398844 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ba4c06f-efab-4839-a284-228cc474b54f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5ba4c06f-efab-4839-a284-228cc474b54f" (UID: "5ba4c06f-efab-4839-a284-228cc474b54f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:34:00.489403 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.489368 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sjkng\" (UniqueName: \"kubernetes.io/projected/5ba4c06f-efab-4839-a284-228cc474b54f-kube-api-access-sjkng\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:34:00.489403 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.489397 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5ba4c06f-efab-4839-a284-228cc474b54f-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:34:00.489403 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.489408 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ba4c06f-efab-4839-a284-228cc474b54f-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:34:00.489676 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.489418 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5ba4c06f-efab-4839-a284-228cc474b54f-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:34:00.882606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.882544 2564 generic.go:358] "Generic (PLEG): container finished" podID="5ba4c06f-efab-4839-a284-228cc474b54f" containerID="e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134" exitCode=0
Apr 24 19:34:00.882784 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.882614 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerDied","Data":"e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134"}
Apr 24 19:34:00.882784 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.882638 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2" event={"ID":"5ba4c06f-efab-4839-a284-228cc474b54f","Type":"ContainerDied","Data":"fdf95fccbdc3318ebf77d69d1cf906322aaf29530d4157654c7a9caec1e6bf5c"}
Apr 24 19:34:00.882784 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.882645 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"
Apr 24 19:34:00.882784 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.882653 2564 scope.go:117] "RemoveContainer" containerID="3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54"
Apr 24 19:34:00.891001 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.890983 2564 scope.go:117] "RemoveContainer" containerID="e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134"
Apr 24 19:34:00.897664 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.897642 2564 scope.go:117] "RemoveContainer" containerID="06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6"
Apr 24 19:34:00.903084 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.903056 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"]
Apr 24 19:34:00.904670 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.904653 2564 scope.go:117] "RemoveContainer" containerID="3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54"
Apr 24 19:34:00.904900 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:34:00.904881 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54\": container with ID starting with 3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54 not found: ID does not exist" containerID="3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54"
Apr 24 19:34:00.904973 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.904913 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54"} err="failed to get container status \"3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54\": rpc error: code = NotFound desc = could not find container \"3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54\": container with ID starting with 3b5c3f5f2cf6c1fa2501c2ed0fc37acd11562fa0390cb3a5d285fa7e46ff3d54 not found: ID does not exist"
Apr 24 19:34:00.904973 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.904938 2564 scope.go:117] "RemoveContainer" containerID="e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134"
Apr 24 19:34:00.905185 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:34:00.905166 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134\": container with ID starting with e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134 not found: ID does not exist" containerID="e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134"
Apr 24 19:34:00.905235 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.905192 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134"} err="failed to get container status \"e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134\": rpc error: code = NotFound desc = could not find container \"e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134\": container with ID starting with e19fa15412b8e52478341292f6a5a8bf90027b78ecc7a5a207ea5e873a168134 not found: ID does not exist"
Apr 24 19:34:00.905235 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.905213 2564 scope.go:117] "RemoveContainer" containerID="06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6"
Apr 24 19:34:00.905427 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:34:00.905409 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6\": container with ID starting with 06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6 not found: ID does not exist" containerID="06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6"
Apr 24 19:34:00.905494 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.905429 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6"} err="failed to get container status \"06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6\": rpc error: code = NotFound desc = could not find container \"06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6\": container with ID starting with 06ed23f2b9696e0ffd8c0b39a86af87a41b72e4689f93d36c7007db7316518c6 not found: ID does not exist"
Apr 24 19:34:00.907322 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:00.907301 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-2fvh2"]
Apr 24 19:34:01.184104 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:01.184027 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" path="/var/lib/kubelet/pods/5ba4c06f-efab-4839-a284-228cc474b54f/volumes"
Apr 24 19:34:02.890475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:02.890387 2564 generic.go:358] "Generic (PLEG): container finished" podID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerID="b2b993f6d43a8f5231f1839e0c7d216c386bacb8522ba7b7172bf22b56724652" exitCode=0
Apr 24 19:34:02.890475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:02.890438 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerDied","Data":"b2b993f6d43a8f5231f1839e0c7d216c386bacb8522ba7b7172bf22b56724652"}
Apr 24 19:34:09.917009 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:09.916918 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerStarted","Data":"0ee9eba74e65c234c17cb3ca109a33be55c9a1e8149456164b741c0e6e42c957"}
Apr 24 19:34:09.917009 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:09.916959 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerStarted","Data":"f2e6712144a9944f9dca2362f684ea00f8a22e8aecd48321c3448952b554c9d2"}
Apr 24 19:34:09.917401 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:09.917167 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:34:09.939349 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:09.939303 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podStartSLOduration=6.24118795 podStartE2EDuration="12.939288179s" podCreationTimestamp="2026-04-24 19:33:57 +0000 UTC" firstStartedPulling="2026-04-24 19:34:02.891624765 +0000 UTC m=+1636.215868905" lastFinishedPulling="2026-04-24 19:34:09.589724989 +0000 UTC m=+1642.913969134" observedRunningTime="2026-04-24 19:34:09.937282723 +0000 UTC m=+1643.261526885" watchObservedRunningTime="2026-04-24 19:34:09.939288179 +0000 UTC m=+1643.263532341"
Apr 24 19:34:10.920400 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:10.920367 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:34:10.921468 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:10.921425 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:34:11.922716 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:11.922675 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:34:16.927726 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:16.927695 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:34:16.928380 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:16.928345 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:34:26.928973 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:26.928926 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:34:36.928675 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:36.928635 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:34:46.928891 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:46.928836 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:34:56.928505 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:34:56.928464 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:35:06.928526 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:06.928434 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:35:16.928628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:16.928577 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:35:26.928781 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:26.928739 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 24 19:35:36.928712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:36.928683 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:35:38.495225 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.495192 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"]
Apr 24 19:35:38.495747 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.495494 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" containerID="cri-o://f2e6712144a9944f9dca2362f684ea00f8a22e8aecd48321c3448952b554c9d2" gracePeriod=30
Apr 24 19:35:38.495747 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.495591 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kube-rbac-proxy" containerID="cri-o://0ee9eba74e65c234c17cb3ca109a33be55c9a1e8149456164b741c0e6e42c957" gracePeriod=30
Apr 24 19:35:38.591214 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591179 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"]
Apr 24 19:35:38.591574 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591540 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container"
Apr 24 19:35:38.591574 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591576 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container"
Apr 24 19:35:38.591737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591588 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kube-rbac-proxy"
Apr 24 19:35:38.591737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591596 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kube-rbac-proxy"
Apr 24 19:35:38.591737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591631 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="storage-initializer"
Apr 24 19:35:38.591737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591642 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="storage-initializer"
Apr 24 19:35:38.591737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591718 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kserve-container"
Apr 24 19:35:38.591737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.591732 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ba4c06f-efab-4839-a284-228cc474b54f" containerName="kube-rbac-proxy"
Apr 24 19:35:38.594960 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.594937 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.597310 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.597292 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\""
Apr 24 19:35:38.597397 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.597346 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\""
Apr 24 19:35:38.603864 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.603839 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"]
Apr 24 19:35:38.726361 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.726314 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/554bd413-c83e-4c02-8107-29fe0d54f8ee-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.726583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.726371 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/554bd413-c83e-4c02-8107-29fe0d54f8ee-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.726583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.726436 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6f95\" (UniqueName: \"kubernetes.io/projected/554bd413-c83e-4c02-8107-29fe0d54f8ee-kube-api-access-w6f95\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.726583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.726508 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/554bd413-c83e-4c02-8107-29fe0d54f8ee-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.827763 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.827679 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/554bd413-c83e-4c02-8107-29fe0d54f8ee-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.827763 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.827721 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/554bd413-c83e-4c02-8107-29fe0d54f8ee-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.827957 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.827866 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6f95\" (UniqueName: \"kubernetes.io/projected/554bd413-c83e-4c02-8107-29fe0d54f8ee-kube-api-access-w6f95\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.828018 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.827958 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/554bd413-c83e-4c02-8107-29fe0d54f8ee-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.828121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.828096 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/554bd413-c83e-4c02-8107-29fe0d54f8ee-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.828542 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.828517 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/554bd413-c83e-4c02-8107-29fe0d54f8ee-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.830189 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.830168 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/554bd413-c83e-4c02-8107-29fe0d54f8ee-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.836122 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.836098 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6f95\" (UniqueName: \"kubernetes.io/projected/554bd413-c83e-4c02-8107-29fe0d54f8ee-kube-api-access-w6f95\") pod \"isvc-pmml-runtime-predictor-67bc544947-52ktx\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:38.906700 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:38.906661 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"
Apr 24 19:35:39.027731 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:39.027697 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"]
Apr 24 19:35:39.030726 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:35:39.030699 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod554bd413_c83e_4c02_8107_29fe0d54f8ee.slice/crio-5eaa1a83a9a8a724472313e91bb7a418855057f9481e7016b3917856181da228 WatchSource:0}: Error finding container 5eaa1a83a9a8a724472313e91bb7a418855057f9481e7016b3917856181da228: Status 404 returned error can't find the container with id 5eaa1a83a9a8a724472313e91bb7a418855057f9481e7016b3917856181da228
Apr 24 19:35:39.169320 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:39.169289 2564 generic.go:358] "Generic (PLEG): container finished" podID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerID="0ee9eba74e65c234c17cb3ca109a33be55c9a1e8149456164b741c0e6e42c957" exitCode=2
Apr 24 19:35:39.169526 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:39.169372 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerDied","Data":"0ee9eba74e65c234c17cb3ca109a33be55c9a1e8149456164b741c0e6e42c957"}
Apr 24 19:35:39.170752 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:39.170725 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerStarted","Data":"a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820"}
Apr 24 19:35:39.170752 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:39.170752 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerStarted","Data":"5eaa1a83a9a8a724472313e91bb7a418855057f9481e7016b3917856181da228"}
Apr 24 19:35:41.923185 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:41.923131 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused"
Apr 24 19:35:42.183113 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.183011 2564 generic.go:358] "Generic (PLEG): container finished" podID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerID="f2e6712144a9944f9dca2362f684ea00f8a22e8aecd48321c3448952b554c9d2" exitCode=0
Apr 24 19:35:42.183286 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.183149 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerDied","Data":"f2e6712144a9944f9dca2362f684ea00f8a22e8aecd48321c3448952b554c9d2"}
Apr 24 19:35:42.237032 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.237008 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"
Apr 24 19:35:42.357976 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.357934 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e20f801-56e0-4215-adbb-ee401b14af1e-kserve-provision-location\") pod \"1e20f801-56e0-4215-adbb-ee401b14af1e\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") "
Apr 24 19:35:42.358174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.357987 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbqh4\" (UniqueName: \"kubernetes.io/projected/1e20f801-56e0-4215-adbb-ee401b14af1e-kube-api-access-bbqh4\") pod \"1e20f801-56e0-4215-adbb-ee401b14af1e\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") "
Apr 24 19:35:42.358174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.358024 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e20f801-56e0-4215-adbb-ee401b14af1e-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"1e20f801-56e0-4215-adbb-ee401b14af1e\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") "
Apr 24 19:35:42.358174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.358086 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls\") pod \"1e20f801-56e0-4215-adbb-ee401b14af1e\" (UID: \"1e20f801-56e0-4215-adbb-ee401b14af1e\") "
Apr 24 19:35:42.358354 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.358267 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1e20f801-56e0-4215-adbb-ee401b14af1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1e20f801-56e0-4215-adbb-ee401b14af1e" (UID:
"1e20f801-56e0-4215-adbb-ee401b14af1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:35:42.358484 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.358459 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1e20f801-56e0-4215-adbb-ee401b14af1e-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "1e20f801-56e0-4215-adbb-ee401b14af1e" (UID: "1e20f801-56e0-4215-adbb-ee401b14af1e"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:35:42.358540 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.358468 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1e20f801-56e0-4215-adbb-ee401b14af1e-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:35:42.360123 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.360100 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e20f801-56e0-4215-adbb-ee401b14af1e-kube-api-access-bbqh4" (OuterVolumeSpecName: "kube-api-access-bbqh4") pod "1e20f801-56e0-4215-adbb-ee401b14af1e" (UID: "1e20f801-56e0-4215-adbb-ee401b14af1e"). InnerVolumeSpecName "kube-api-access-bbqh4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:35:42.360187 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.360141 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1e20f801-56e0-4215-adbb-ee401b14af1e" (UID: "1e20f801-56e0-4215-adbb-ee401b14af1e"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:35:42.459428 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.459388 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbqh4\" (UniqueName: \"kubernetes.io/projected/1e20f801-56e0-4215-adbb-ee401b14af1e-kube-api-access-bbqh4\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:35:42.459428 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.459428 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1e20f801-56e0-4215-adbb-ee401b14af1e-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:35:42.459680 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:42.459445 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1e20f801-56e0-4215-adbb-ee401b14af1e-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:35:43.187298 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.187265 2564 generic.go:358] "Generic (PLEG): container finished" podID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerID="a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820" exitCode=0 Apr 24 19:35:43.187796 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.187335 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerDied","Data":"a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820"} Apr 24 19:35:43.189134 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.189112 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" 
event={"ID":"1e20f801-56e0-4215-adbb-ee401b14af1e","Type":"ContainerDied","Data":"fc05d9b601c15f1752b3f826d9ec53251ce18c1de2a495fbfd4e9deec315d183"} Apr 24 19:35:43.189245 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.189154 2564 scope.go:117] "RemoveContainer" containerID="0ee9eba74e65c234c17cb3ca109a33be55c9a1e8149456164b741c0e6e42c957" Apr 24 19:35:43.189245 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.189199 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29" Apr 24 19:35:43.202608 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.202508 2564 scope.go:117] "RemoveContainer" containerID="f2e6712144a9944f9dca2362f684ea00f8a22e8aecd48321c3448952b554c9d2" Apr 24 19:35:43.214373 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.214358 2564 scope.go:117] "RemoveContainer" containerID="b2b993f6d43a8f5231f1839e0c7d216c386bacb8522ba7b7172bf22b56724652" Apr 24 19:35:43.217759 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.217736 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"] Apr 24 19:35:43.221393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:43.221372 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-slm29"] Apr 24 19:35:44.194154 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:44.194117 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerStarted","Data":"0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c"} Apr 24 19:35:44.194154 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:44.194157 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" 
event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerStarted","Data":"7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e"} Apr 24 19:35:44.194728 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:44.194385 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" Apr 24 19:35:44.213476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:44.213423 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podStartSLOduration=6.213407527 podStartE2EDuration="6.213407527s" podCreationTimestamp="2026-04-24 19:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:35:44.21263333 +0000 UTC m=+1737.536877496" watchObservedRunningTime="2026-04-24 19:35:44.213407527 +0000 UTC m=+1737.537651691" Apr 24 19:35:45.183831 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:45.183798 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" path="/var/lib/kubelet/pods/1e20f801-56e0-4215-adbb-ee401b14af1e/volumes" Apr 24 19:35:45.197946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:45.197911 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" Apr 24 19:35:45.198981 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:45.198955 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:35:46.199922 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:46.199874 2564 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:35:51.204541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:51.204509 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" Apr 24 19:35:51.205145 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:35:51.205114 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:36:01.205837 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:36:01.205790 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:36:11.206006 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:36:11.205966 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:36:21.205192 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:36:21.205150 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 
19:36:31.205254 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:36:31.205212 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:36:41.205686 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:36:41.205602 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:36:51.205168 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:36:51.205119 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:37:01.205635 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:01.205593 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 24 19:37:09.185094 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:09.185065 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" Apr 24 19:37:19.696429 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.696395 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"] Apr 24 19:37:19.696859 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:37:19.696723 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" containerID="cri-o://7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e" gracePeriod=30 Apr 24 19:37:19.696859 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.696742 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kube-rbac-proxy" containerID="cri-o://0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c" gracePeriod=30 Apr 24 19:37:19.804677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.804645 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t"] Apr 24 19:37:19.804992 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.804978 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kube-rbac-proxy" Apr 24 19:37:19.805039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.804994 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kube-rbac-proxy" Apr 24 19:37:19.805039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.805009 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="storage-initializer" Apr 24 19:37:19.805039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.805014 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="storage-initializer" Apr 24 19:37:19.805039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.805025 2564 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" Apr 24 19:37:19.805039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.805031 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" Apr 24 19:37:19.805211 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.805085 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kserve-container" Apr 24 19:37:19.805211 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.805096 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e20f801-56e0-4215-adbb-ee401b14af1e" containerName="kube-rbac-proxy" Apr 24 19:37:19.808036 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.808015 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:19.810661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.810634 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 24 19:37:19.810799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.810636 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 24 19:37:19.820518 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.820488 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t"] Apr 24 19:37:19.919213 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.919173 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bed84f9-3a48-4310-8334-d806f9999cc6-proxy-tls\") pod 
\"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:19.919213 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.919220 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bed84f9-3a48-4310-8334-d806f9999cc6-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:19.919477 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.919312 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4j7f\" (UniqueName: \"kubernetes.io/projected/1bed84f9-3a48-4310-8334-d806f9999cc6-kube-api-access-k4j7f\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:19.919477 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:19.919347 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bed84f9-3a48-4310-8334-d806f9999cc6-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.019957 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.019863 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/1bed84f9-3a48-4310-8334-d806f9999cc6-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.019957 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.019922 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4j7f\" (UniqueName: \"kubernetes.io/projected/1bed84f9-3a48-4310-8334-d806f9999cc6-kube-api-access-k4j7f\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.020259 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.020031 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bed84f9-3a48-4310-8334-d806f9999cc6-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.020259 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.020127 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bed84f9-3a48-4310-8334-d806f9999cc6-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.020427 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.020404 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bed84f9-3a48-4310-8334-d806f9999cc6-kserve-provision-location\") pod 
\"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.020692 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.020670 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bed84f9-3a48-4310-8334-d806f9999cc6-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.022440 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.022422 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bed84f9-3a48-4310-8334-d806f9999cc6-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.032498 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.032475 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4j7f\" (UniqueName: \"kubernetes.io/projected/1bed84f9-3a48-4310-8334-d806f9999cc6-kube-api-access-k4j7f\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.118070 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.118025 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:20.239567 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.239513 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t"] Apr 24 19:37:20.243171 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:37:20.243145 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bed84f9_3a48_4310_8334_d806f9999cc6.slice/crio-85617ac532db338003fde61a1e8110b575b3a176f8ce0ca82cd8667e48d68e9e WatchSource:0}: Error finding container 85617ac532db338003fde61a1e8110b575b3a176f8ce0ca82cd8667e48d68e9e: Status 404 returned error can't find the container with id 85617ac532db338003fde61a1e8110b575b3a176f8ce0ca82cd8667e48d68e9e Apr 24 19:37:20.244998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.244980 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:37:20.455240 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.455209 2564 generic.go:358] "Generic (PLEG): container finished" podID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerID="0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c" exitCode=2 Apr 24 19:37:20.455427 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.455289 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerDied","Data":"0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c"} Apr 24 19:37:20.456533 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.456507 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" 
event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerStarted","Data":"3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7"} Apr 24 19:37:20.456658 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:20.456540 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerStarted","Data":"85617ac532db338003fde61a1e8110b575b3a176f8ce0ca82cd8667e48d68e9e"} Apr 24 19:37:21.200739 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:21.200695 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.35:8643/healthz\": dial tcp 10.132.0.35:8643: connect: connection refused" Apr 24 19:37:23.343493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.343465 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" Apr 24 19:37:23.451583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.451541 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/554bd413-c83e-4c02-8107-29fe0d54f8ee-kserve-provision-location\") pod \"554bd413-c83e-4c02-8107-29fe0d54f8ee\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " Apr 24 19:37:23.451756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.451620 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6f95\" (UniqueName: \"kubernetes.io/projected/554bd413-c83e-4c02-8107-29fe0d54f8ee-kube-api-access-w6f95\") pod \"554bd413-c83e-4c02-8107-29fe0d54f8ee\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " Apr 24 19:37:23.451756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.451649 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/554bd413-c83e-4c02-8107-29fe0d54f8ee-proxy-tls\") pod \"554bd413-c83e-4c02-8107-29fe0d54f8ee\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " Apr 24 19:37:23.451756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.451679 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/554bd413-c83e-4c02-8107-29fe0d54f8ee-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"554bd413-c83e-4c02-8107-29fe0d54f8ee\" (UID: \"554bd413-c83e-4c02-8107-29fe0d54f8ee\") " Apr 24 19:37:23.451936 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.451907 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/554bd413-c83e-4c02-8107-29fe0d54f8ee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"554bd413-c83e-4c02-8107-29fe0d54f8ee" (UID: "554bd413-c83e-4c02-8107-29fe0d54f8ee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:37:23.452121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.452092 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/554bd413-c83e-4c02-8107-29fe0d54f8ee-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "554bd413-c83e-4c02-8107-29fe0d54f8ee" (UID: "554bd413-c83e-4c02-8107-29fe0d54f8ee"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:37:23.453820 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.453799 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/554bd413-c83e-4c02-8107-29fe0d54f8ee-kube-api-access-w6f95" (OuterVolumeSpecName: "kube-api-access-w6f95") pod "554bd413-c83e-4c02-8107-29fe0d54f8ee" (UID: "554bd413-c83e-4c02-8107-29fe0d54f8ee"). InnerVolumeSpecName "kube-api-access-w6f95". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:37:23.453903 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.453882 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/554bd413-c83e-4c02-8107-29fe0d54f8ee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "554bd413-c83e-4c02-8107-29fe0d54f8ee" (UID: "554bd413-c83e-4c02-8107-29fe0d54f8ee"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:37:23.466533 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.466504 2564 generic.go:358] "Generic (PLEG): container finished" podID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerID="7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e" exitCode=0 Apr 24 19:37:23.466661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.466586 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerDied","Data":"7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e"} Apr 24 19:37:23.466661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.466600 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" Apr 24 19:37:23.466661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.466630 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx" event={"ID":"554bd413-c83e-4c02-8107-29fe0d54f8ee","Type":"ContainerDied","Data":"5eaa1a83a9a8a724472313e91bb7a418855057f9481e7016b3917856181da228"} Apr 24 19:37:23.466661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.466648 2564 scope.go:117] "RemoveContainer" containerID="0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c" Apr 24 19:37:23.474681 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.474534 2564 scope.go:117] "RemoveContainer" containerID="7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e" Apr 24 19:37:23.482002 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.481985 2564 scope.go:117] "RemoveContainer" containerID="a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820" Apr 24 19:37:23.488013 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.487991 2564 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"] Apr 24 19:37:23.489125 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.489110 2564 scope.go:117] "RemoveContainer" containerID="0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c" Apr 24 19:37:23.489378 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:37:23.489358 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c\": container with ID starting with 0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c not found: ID does not exist" containerID="0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c" Apr 24 19:37:23.489424 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.489390 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c"} err="failed to get container status \"0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c\": rpc error: code = NotFound desc = could not find container \"0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c\": container with ID starting with 0380cfee34ffe2c1e69476b34f5329aeeadc1727765e093817166cdbfc47750c not found: ID does not exist" Apr 24 19:37:23.489424 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.489409 2564 scope.go:117] "RemoveContainer" containerID="7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e" Apr 24 19:37:23.489670 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:37:23.489652 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e\": container with ID starting with 7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e not found: ID does not exist" 
containerID="7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e" Apr 24 19:37:23.489717 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.489677 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e"} err="failed to get container status \"7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e\": rpc error: code = NotFound desc = could not find container \"7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e\": container with ID starting with 7b0d66d2bb81fdd9e1fe4ca47b339d2deff33d285e90e6bf4618347f79c6128e not found: ID does not exist" Apr 24 19:37:23.489717 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.489693 2564 scope.go:117] "RemoveContainer" containerID="a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820" Apr 24 19:37:23.489910 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:37:23.489894 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820\": container with ID starting with a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820 not found: ID does not exist" containerID="a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820" Apr 24 19:37:23.489946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.489914 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820"} err="failed to get container status \"a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820\": rpc error: code = NotFound desc = could not find container \"a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820\": container with ID starting with a75cba3475fdf09bd691f536e51ec725e7400164a3593f7bc91ab4e69b1db820 not found: ID does not exist" Apr 24 
19:37:23.492830 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.492806 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-52ktx"] Apr 24 19:37:23.552606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.552573 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/554bd413-c83e-4c02-8107-29fe0d54f8ee-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:37:23.552606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.552602 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w6f95\" (UniqueName: \"kubernetes.io/projected/554bd413-c83e-4c02-8107-29fe0d54f8ee-kube-api-access-w6f95\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:37:23.552606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.552614 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/554bd413-c83e-4c02-8107-29fe0d54f8ee-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:37:23.552817 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:23.552625 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/554bd413-c83e-4c02-8107-29fe0d54f8ee-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:37:24.474083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:24.474049 2564 generic.go:358] "Generic (PLEG): container finished" podID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerID="3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7" exitCode=0 Apr 24 19:37:24.474445 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:24.474124 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerDied","Data":"3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7"} Apr 24 19:37:25.184461 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:25.184429 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" path="/var/lib/kubelet/pods/554bd413-c83e-4c02-8107-29fe0d54f8ee/volumes" Apr 24 19:37:25.478930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:25.478894 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerStarted","Data":"0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905"} Apr 24 19:37:25.478930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:25.478931 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerStarted","Data":"4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0"} Apr 24 19:37:25.479489 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:25.479162 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:25.498861 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:25.498816 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podStartSLOduration=6.498800071 podStartE2EDuration="6.498800071s" podCreationTimestamp="2026-04-24 19:37:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:37:25.496959433 +0000 UTC m=+1838.821203619" 
watchObservedRunningTime="2026-04-24 19:37:25.498800071 +0000 UTC m=+1838.823044236" Apr 24 19:37:26.481395 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:26.481362 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:26.482705 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:26.482678 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:37:27.484067 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:27.484023 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:37:32.488879 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:32.488850 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:37:32.489465 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:32.489437 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:37:42.489359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:42.489315 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:37:52.490188 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:37:52.490143 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:38:02.489696 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:02.489655 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:38:12.489886 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:12.489787 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:38:22.489364 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:22.489313 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:38:32.490325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:32.490284 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:38:42.489813 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:38:42.489768 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 24 19:38:47.184781 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:47.184746 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:38:50.787286 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.787244 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t"] Apr 24 19:38:50.787686 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.787591 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" containerID="cri-o://4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0" gracePeriod=30 Apr 24 19:38:50.787758 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.787645 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kube-rbac-proxy" containerID="cri-o://0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905" gracePeriod=30 Apr 24 19:38:50.922428 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922396 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq"] Apr 24 19:38:50.922751 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922721 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" 
containerName="kserve-container" Apr 24 19:38:50.922801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922754 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" Apr 24 19:38:50.922801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922766 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kube-rbac-proxy" Apr 24 19:38:50.922801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922773 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kube-rbac-proxy" Apr 24 19:38:50.922801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922784 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="storage-initializer" Apr 24 19:38:50.922801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922792 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="storage-initializer" Apr 24 19:38:50.922954 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922846 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kserve-container" Apr 24 19:38:50.922954 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.922854 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="554bd413-c83e-4c02-8107-29fe0d54f8ee" containerName="kube-rbac-proxy" Apr 24 19:38:50.925835 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.925814 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:50.928708 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.928687 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-a07d9f-predictor-serving-cert\"" Apr 24 19:38:50.928833 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.928714 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-a07d9f-kube-rbac-proxy-sar-config\"" Apr 24 19:38:50.950447 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:50.950416 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq"] Apr 24 19:38:51.020007 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.019958 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d85h6\" (UniqueName: \"kubernetes.io/projected/494037ef-4514-4b97-b93f-7df80965ae7e-kube-api-access-d85h6\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.020007 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.020000 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/494037ef-4514-4b97-b93f-7df80965ae7e-kserve-provision-location\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.020237 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.020046 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/494037ef-4514-4b97-b93f-7df80965ae7e-isvc-primary-a07d9f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.020237 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.020087 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.120511 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.120417 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/494037ef-4514-4b97-b93f-7df80965ae7e-isvc-primary-a07d9f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.120511 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.120479 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.120511 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.120501 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d85h6\" (UniqueName: \"kubernetes.io/projected/494037ef-4514-4b97-b93f-7df80965ae7e-kube-api-access-d85h6\") pod 
\"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.120847 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.120519 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/494037ef-4514-4b97-b93f-7df80965ae7e-kserve-provision-location\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.120847 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:38:51.120667 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-serving-cert: secret "isvc-primary-a07d9f-predictor-serving-cert" not found Apr 24 19:38:51.120847 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:38:51.120763 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls podName:494037ef-4514-4b97-b93f-7df80965ae7e nodeName:}" failed. No retries permitted until 2026-04-24 19:38:51.620741815 +0000 UTC m=+1924.944985955 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls") pod "isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" (UID: "494037ef-4514-4b97-b93f-7df80965ae7e") : secret "isvc-primary-a07d9f-predictor-serving-cert" not found Apr 24 19:38:51.121014 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.120938 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/494037ef-4514-4b97-b93f-7df80965ae7e-kserve-provision-location\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.121225 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.121204 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/494037ef-4514-4b97-b93f-7df80965ae7e-isvc-primary-a07d9f-kube-rbac-proxy-sar-config\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.129512 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.129485 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d85h6\" (UniqueName: \"kubernetes.io/projected/494037ef-4514-4b97-b93f-7df80965ae7e-kube-api-access-d85h6\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.624287 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.624229 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls\") pod 
\"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.626704 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.626674 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls\") pod \"isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.716849 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.716791 2564 generic.go:358] "Generic (PLEG): container finished" podID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerID="0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905" exitCode=2 Apr 24 19:38:51.717019 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.716860 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerDied","Data":"0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905"} Apr 24 19:38:51.835407 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.835371 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:51.954767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:51.954739 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq"] Apr 24 19:38:51.957465 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:38:51.957437 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod494037ef_4514_4b97_b93f_7df80965ae7e.slice/crio-02d048df1af5e8cbfe1287da1879c50ee3f0d0cb52387eb48c40c402b3f42e27 WatchSource:0}: Error finding container 02d048df1af5e8cbfe1287da1879c50ee3f0d0cb52387eb48c40c402b3f42e27: Status 404 returned error can't find the container with id 02d048df1af5e8cbfe1287da1879c50ee3f0d0cb52387eb48c40c402b3f42e27 Apr 24 19:38:52.484458 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:52.484419 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused" Apr 24 19:38:52.722039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:52.721991 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerStarted","Data":"36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0"} Apr 24 19:38:52.722039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:52.722037 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerStarted","Data":"02d048df1af5e8cbfe1287da1879c50ee3f0d0cb52387eb48c40c402b3f42e27"} 
Apr 24 19:38:54.522600 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.522544 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:38:54.648039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.647944 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4j7f\" (UniqueName: \"kubernetes.io/projected/1bed84f9-3a48-4310-8334-d806f9999cc6-kube-api-access-k4j7f\") pod \"1bed84f9-3a48-4310-8334-d806f9999cc6\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " Apr 24 19:38:54.648039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.648022 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bed84f9-3a48-4310-8334-d806f9999cc6-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"1bed84f9-3a48-4310-8334-d806f9999cc6\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " Apr 24 19:38:54.648039 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.648043 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bed84f9-3a48-4310-8334-d806f9999cc6-kserve-provision-location\") pod \"1bed84f9-3a48-4310-8334-d806f9999cc6\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " Apr 24 19:38:54.648332 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.648067 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bed84f9-3a48-4310-8334-d806f9999cc6-proxy-tls\") pod \"1bed84f9-3a48-4310-8334-d806f9999cc6\" (UID: \"1bed84f9-3a48-4310-8334-d806f9999cc6\") " Apr 24 19:38:54.648423 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.648394 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/1bed84f9-3a48-4310-8334-d806f9999cc6-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "1bed84f9-3a48-4310-8334-d806f9999cc6" (UID: "1bed84f9-3a48-4310-8334-d806f9999cc6"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:38:54.648479 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.648414 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bed84f9-3a48-4310-8334-d806f9999cc6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1bed84f9-3a48-4310-8334-d806f9999cc6" (UID: "1bed84f9-3a48-4310-8334-d806f9999cc6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:38:54.650099 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.650077 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bed84f9-3a48-4310-8334-d806f9999cc6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1bed84f9-3a48-4310-8334-d806f9999cc6" (UID: "1bed84f9-3a48-4310-8334-d806f9999cc6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:38:54.650165 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.650081 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bed84f9-3a48-4310-8334-d806f9999cc6-kube-api-access-k4j7f" (OuterVolumeSpecName: "kube-api-access-k4j7f") pod "1bed84f9-3a48-4310-8334-d806f9999cc6" (UID: "1bed84f9-3a48-4310-8334-d806f9999cc6"). InnerVolumeSpecName "kube-api-access-k4j7f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:38:54.728515 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.728477 2564 generic.go:358] "Generic (PLEG): container finished" podID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerID="4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0" exitCode=0 Apr 24 19:38:54.728732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.728522 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerDied","Data":"4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0"} Apr 24 19:38:54.728732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.728569 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" event={"ID":"1bed84f9-3a48-4310-8334-d806f9999cc6","Type":"ContainerDied","Data":"85617ac532db338003fde61a1e8110b575b3a176f8ce0ca82cd8667e48d68e9e"} Apr 24 19:38:54.728732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.728585 2564 scope.go:117] "RemoveContainer" containerID="0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905" Apr 24 19:38:54.728732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.728592 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t" Apr 24 19:38:54.736669 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.736479 2564 scope.go:117] "RemoveContainer" containerID="4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0" Apr 24 19:38:54.743755 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.743738 2564 scope.go:117] "RemoveContainer" containerID="3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7" Apr 24 19:38:54.748933 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.748889 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1bed84f9-3a48-4310-8334-d806f9999cc6-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:38:54.748933 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.748930 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k4j7f\" (UniqueName: \"kubernetes.io/projected/1bed84f9-3a48-4310-8334-d806f9999cc6-kube-api-access-k4j7f\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:38:54.749105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.748949 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1bed84f9-3a48-4310-8334-d806f9999cc6-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:38:54.749105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.748968 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1bed84f9-3a48-4310-8334-d806f9999cc6-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:38:54.751069 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.751046 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t"] Apr 24 19:38:54.751148 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.751100 2564 scope.go:117] "RemoveContainer" containerID="0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905" Apr 24 19:38:54.751409 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:38:54.751392 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905\": container with ID starting with 0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905 not found: ID does not exist" containerID="0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905" Apr 24 19:38:54.751453 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.751420 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905"} err="failed to get container status \"0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905\": rpc error: code = NotFound desc = could not find container \"0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905\": container with ID starting with 0dcafb063eeb7aaeb961b216d1eaf1d6a11dd1906bdb92ad903c56759e93b905 not found: ID does not exist" Apr 24 19:38:54.751453 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.751440 2564 scope.go:117] "RemoveContainer" containerID="4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0" Apr 24 19:38:54.751719 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:38:54.751702 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0\": container with ID starting with 4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0 not found: ID does not exist" 
containerID="4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0" Apr 24 19:38:54.751780 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.751724 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0"} err="failed to get container status \"4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0\": rpc error: code = NotFound desc = could not find container \"4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0\": container with ID starting with 4ef4ba689933d168abaa14377466400ade4e9b89671fefa739eeb29a890844b0 not found: ID does not exist" Apr 24 19:38:54.751780 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.751750 2564 scope.go:117] "RemoveContainer" containerID="3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7" Apr 24 19:38:54.752007 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:38:54.751988 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7\": container with ID starting with 3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7 not found: ID does not exist" containerID="3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7" Apr 24 19:38:54.752081 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.752015 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7"} err="failed to get container status \"3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7\": rpc error: code = NotFound desc = could not find container \"3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7\": container with ID starting with 3da32ede1de5e1723f90651db25e1d0228567ea11085cda0fdea7f7cf1f8a7b7 not found: ID does not exist" Apr 24 
19:38:54.757725 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:54.757699 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-nlh4t"] Apr 24 19:38:55.184539 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:55.184505 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" path="/var/lib/kubelet/pods/1bed84f9-3a48-4310-8334-d806f9999cc6/volumes" Apr 24 19:38:56.737291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:56.737255 2564 generic.go:358] "Generic (PLEG): container finished" podID="494037ef-4514-4b97-b93f-7df80965ae7e" containerID="36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0" exitCode=0 Apr 24 19:38:56.737791 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:56.737318 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerDied","Data":"36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0"} Apr 24 19:38:57.743119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:57.743087 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerStarted","Data":"4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533"} Apr 24 19:38:57.743532 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:57.743127 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerStarted","Data":"9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf"} Apr 24 19:38:57.743532 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:57.743408 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:57.743532 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:57.743515 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:38:57.744884 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:57.744857 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:38:57.762088 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:57.762040 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podStartSLOduration=7.762025282 podStartE2EDuration="7.762025282s" podCreationTimestamp="2026-04-24 19:38:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:38:57.760939432 +0000 UTC m=+1931.085183624" watchObservedRunningTime="2026-04-24 19:38:57.762025282 +0000 UTC m=+1931.086269445" Apr 24 19:38:58.746139 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:38:58.746100 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:39:03.751220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:39:03.751188 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:39:03.751779 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:39:03.751753 2564 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:39:13.752440 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:39:13.752393 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:39:23.751654 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:39:23.751612 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:39:33.752466 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:39:33.752423 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:39:43.752152 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:39:43.752061 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:39:53.752344 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:39:53.752301 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" 
podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 24 19:40:03.752408 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:03.752374 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:40:11.084295 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084255 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m"] Apr 24 19:40:11.084767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084597 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" Apr 24 19:40:11.084767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084609 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" Apr 24 19:40:11.084767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084623 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kube-rbac-proxy" Apr 24 19:40:11.084767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084628 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kube-rbac-proxy" Apr 24 19:40:11.084767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084646 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="storage-initializer" Apr 24 19:40:11.084767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084652 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="storage-initializer" Apr 24 19:40:11.084767 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:40:11.084700 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kserve-container" Apr 24 19:40:11.084767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.084706 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1bed84f9-3a48-4310-8334-d806f9999cc6" containerName="kube-rbac-proxy" Apr 24 19:40:11.087831 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.087808 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.090490 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.090469 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\"" Apr 24 19:40:11.090610 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.090511 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-a07d9f-dockercfg-r47jr\"" Apr 24 19:40:11.090819 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.090803 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 19:40:11.091235 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.091219 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-a07d9f-predictor-serving-cert\"" Apr 24 19:40:11.091389 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.091372 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-a07d9f\"" Apr 24 19:40:11.100107 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.100081 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m"] Apr 24 
19:40:11.215688 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.215648 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.215688 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.215694 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kg8x\" (UniqueName: \"kubernetes.io/projected/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kube-api-access-2kg8x\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.215941 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.215769 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-cabundle-cert\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.215941 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.215806 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-proxy-tls\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.215941 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.215830 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kserve-provision-location\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317097 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317055 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-cabundle-cert\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317109 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-proxy-tls\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317135 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kserve-provision-location\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317202 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317228 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kg8x\" (UniqueName: \"kubernetes.io/projected/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kube-api-access-2kg8x\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317721 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317691 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kserve-provision-location\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317871 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317849 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.317946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.317892 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-cabundle-cert\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.319565 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.319527 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-proxy-tls\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.325650 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.325623 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kg8x\" (UniqueName: \"kubernetes.io/projected/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kube-api-access-2kg8x\") pod \"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.398534 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.398442 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:11.516539 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.516512 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m"] Apr 24 19:40:11.519197 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:40:11.519168 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ccd68f3_ff9a_4ee4_90f5_45614b48dc0d.slice/crio-a64ab49c09260c9d5e0d66104c9d3ac08f1045c91f2598dfdfdca12551b62526 WatchSource:0}: Error finding container a64ab49c09260c9d5e0d66104c9d3ac08f1045c91f2598dfdfdca12551b62526: Status 404 returned error can't find the container with id a64ab49c09260c9d5e0d66104c9d3ac08f1045c91f2598dfdfdca12551b62526 Apr 24 19:40:11.948570 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.948523 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" event={"ID":"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d","Type":"ContainerStarted","Data":"8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8"} Apr 24 19:40:11.948570 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:11.948569 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" event={"ID":"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d","Type":"ContainerStarted","Data":"a64ab49c09260c9d5e0d66104c9d3ac08f1045c91f2598dfdfdca12551b62526"} Apr 24 19:40:16.963688 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:16.963656 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/storage-initializer/0.log" Apr 24 19:40:16.964052 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:16.963701 2564 generic.go:358] "Generic (PLEG): 
container finished" podID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerID="8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8" exitCode=1 Apr 24 19:40:16.964052 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:16.963741 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" event={"ID":"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d","Type":"ContainerDied","Data":"8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8"} Apr 24 19:40:17.968306 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:17.968276 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/storage-initializer/0.log" Apr 24 19:40:17.968740 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:17.968328 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" event={"ID":"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d","Type":"ContainerStarted","Data":"bf7302462465147c302767ff3d1c4169a7ece798c512a572d32187d399c9859e"} Apr 24 19:40:21.979278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:21.979245 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/storage-initializer/1.log" Apr 24 19:40:21.979756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:21.979610 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/storage-initializer/0.log" Apr 24 19:40:21.979756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:21.979644 2564 generic.go:358] "Generic (PLEG): container finished" podID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerID="bf7302462465147c302767ff3d1c4169a7ece798c512a572d32187d399c9859e" exitCode=1 Apr 24 
19:40:21.979756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:21.979701 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" event={"ID":"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d","Type":"ContainerDied","Data":"bf7302462465147c302767ff3d1c4169a7ece798c512a572d32187d399c9859e"} Apr 24 19:40:21.979756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:21.979735 2564 scope.go:117] "RemoveContainer" containerID="8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8" Apr 24 19:40:21.980137 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:21.980121 2564 scope.go:117] "RemoveContainer" containerID="8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8" Apr 24 19:40:21.989991 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:21.989966 2564 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_kserve-ci-e2e-test_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d_0 in pod sandbox a64ab49c09260c9d5e0d66104c9d3ac08f1045c91f2598dfdfdca12551b62526 from index: no such id: '8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8'" containerID="8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8" Apr 24 19:40:21.990081 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:21.990016 2564 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_kserve-ci-e2e-test_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d_0 in pod sandbox a64ab49c09260c9d5e0d66104c9d3ac08f1045c91f2598dfdfdca12551b62526 from index: no such id: '8e0c0d30d5459f80a07c74818c3d9b9d25d8359a232c2d9e5939775b3a44cee8'; Skipping pod 
\"isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_kserve-ci-e2e-test(9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d)\"" logger="UnhandledError" Apr 24 19:40:21.991354 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:21.991333 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_kserve-ci-e2e-test(9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d)\"" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" Apr 24 19:40:22.983677 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:22.983648 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/storage-initializer/1.log" Apr 24 19:40:29.105302 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.105270 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m"] Apr 24 19:40:29.152951 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.152917 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq"] Apr 24 19:40:29.153291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.153259 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container" containerID="cri-o://9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf" gracePeriod=30 Apr 24 19:40:29.153394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.153335 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" 
podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kube-rbac-proxy" containerID="cri-o://4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533" gracePeriod=30 Apr 24 19:40:29.250379 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.250344 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"] Apr 24 19:40:29.255018 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.255001 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.261049 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.261024 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-6c0429\"" Apr 24 19:40:29.261188 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.261070 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-6c0429-dockercfg-bbzrz\"" Apr 24 19:40:29.261307 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.261292 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-6c0429-predictor-serving-cert\"" Apr 24 19:40:29.261379 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.261329 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\"" Apr 24 19:40:29.274802 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.274780 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/storage-initializer/1.log" Apr 24 19:40:29.274921 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.274860 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:29.276003 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.275977 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"] Apr 24 19:40:29.369799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.369698 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2kg8x\" (UniqueName: \"kubernetes.io/projected/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kube-api-access-2kg8x\") pod \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " Apr 24 19:40:29.369799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.369758 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-proxy-tls\") pod \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " Apr 24 19:40:29.369799 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.369799 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-cabundle-cert\") pod \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " Apr 24 19:40:29.370092 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.369893 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\") pod \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " Apr 24 19:40:29.370092 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.369947 2564 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kserve-provision-location\") pod \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\" (UID: \"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d\") " Apr 24 19:40:29.370092 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370067 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kserve-provision-location\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.370262 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370119 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfmc5\" (UniqueName: \"kubernetes.io/projected/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kube-api-access-wfmc5\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.370262 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370231 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" (UID: "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:40:29.370262 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370238 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" (UID: "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:40:29.370393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370252 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-isvc-secondary-a07d9f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-a07d9f-kube-rbac-proxy-sar-config") pod "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" (UID: "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d"). InnerVolumeSpecName "isvc-secondary-a07d9f-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:40:29.370393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370262 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-cabundle-cert\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.370393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370335 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.370393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370361 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-proxy-tls\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.370541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370404 2564 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-cabundle-cert\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:29.370541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370439 2564 reconciler_common.go:299] "Volume detached for volume 
\"isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-isvc-secondary-a07d9f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:29.370541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.370461 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:29.372052 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.372029 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kube-api-access-2kg8x" (OuterVolumeSpecName: "kube-api-access-2kg8x") pod "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" (UID: "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d"). InnerVolumeSpecName "kube-api-access-2kg8x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:40:29.372125 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.372059 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" (UID: "9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:40:29.471097 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471055 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-cabundle-cert\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.471097 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471106 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.471377 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471129 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-proxy-tls\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.471377 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471164 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kserve-provision-location\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.471377 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:40:29.471197 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfmc5\" (UniqueName: \"kubernetes.io/projected/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kube-api-access-wfmc5\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.471377 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471253 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2kg8x\" (UniqueName: \"kubernetes.io/projected/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-kube-api-access-2kg8x\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:29.471377 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471269 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:29.471694 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471607 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kserve-provision-location\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.471888 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471864 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-cabundle-cert\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.471992 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.471972 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.473712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.473694 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-proxy-tls\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.478895 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.478877 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfmc5\" (UniqueName: \"kubernetes.io/projected/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kube-api-access-wfmc5\") pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") " pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.572815 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.572769 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" Apr 24 19:40:29.695236 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:29.695200 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"] Apr 24 19:40:29.698236 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:40:29.698208 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9756a3f5_8098_40e4_b4ee_3e1a878ed1bd.slice/crio-44625dc5cde11176972e5a98c70b809bdfd61c94627fa2717e18ecaa89633d75 WatchSource:0}: Error finding container 44625dc5cde11176972e5a98c70b809bdfd61c94627fa2717e18ecaa89633d75: Status 404 returned error can't find the container with id 44625dc5cde11176972e5a98c70b809bdfd61c94627fa2717e18ecaa89633d75 Apr 24 19:40:30.003825 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.003783 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" event={"ID":"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd","Type":"ContainerStarted","Data":"d1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f"} Apr 24 19:40:30.003825 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.003824 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" event={"ID":"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd","Type":"ContainerStarted","Data":"44625dc5cde11176972e5a98c70b809bdfd61c94627fa2717e18ecaa89633d75"} Apr 24 19:40:30.005608 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.005580 2564 generic.go:358] "Generic (PLEG): container finished" podID="494037ef-4514-4b97-b93f-7df80965ae7e" containerID="4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533" exitCode=2 Apr 24 19:40:30.005769 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.005656 2564 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerDied","Data":"4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533"} Apr 24 19:40:30.006671 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.006653 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m_9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/storage-initializer/1.log" Apr 24 19:40:30.006782 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.006723 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" event={"ID":"9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d","Type":"ContainerDied","Data":"a64ab49c09260c9d5e0d66104c9d3ac08f1045c91f2598dfdfdca12551b62526"} Apr 24 19:40:30.006782 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.006746 2564 scope.go:117] "RemoveContainer" containerID="bf7302462465147c302767ff3d1c4169a7ece798c512a572d32187d399c9859e" Apr 24 19:40:30.006782 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.006752 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m" Apr 24 19:40:30.050152 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.050115 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m"] Apr 24 19:40:30.055057 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:30.055027 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a07d9f-predictor-5b8fdf86f6-29w2m"] Apr 24 19:40:31.184730 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:31.184696 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" path="/var/lib/kubelet/pods/9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d/volumes" Apr 24 19:40:33.597538 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.597515 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:40:33.704035 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.703380 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls\") pod \"494037ef-4514-4b97-b93f-7df80965ae7e\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " Apr 24 19:40:33.704035 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.703469 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/494037ef-4514-4b97-b93f-7df80965ae7e-kserve-provision-location\") pod \"494037ef-4514-4b97-b93f-7df80965ae7e\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " Apr 24 19:40:33.704035 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.703506 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-a07d9f-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/494037ef-4514-4b97-b93f-7df80965ae7e-isvc-primary-a07d9f-kube-rbac-proxy-sar-config\") pod \"494037ef-4514-4b97-b93f-7df80965ae7e\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " Apr 24 19:40:33.704035 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.703596 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d85h6\" (UniqueName: \"kubernetes.io/projected/494037ef-4514-4b97-b93f-7df80965ae7e-kube-api-access-d85h6\") pod \"494037ef-4514-4b97-b93f-7df80965ae7e\" (UID: \"494037ef-4514-4b97-b93f-7df80965ae7e\") " Apr 24 19:40:33.707235 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.706918 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/494037ef-4514-4b97-b93f-7df80965ae7e-kube-api-access-d85h6" (OuterVolumeSpecName: "kube-api-access-d85h6") pod "494037ef-4514-4b97-b93f-7df80965ae7e" (UID: "494037ef-4514-4b97-b93f-7df80965ae7e"). InnerVolumeSpecName "kube-api-access-d85h6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:40:33.708721 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.708688 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/494037ef-4514-4b97-b93f-7df80965ae7e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "494037ef-4514-4b97-b93f-7df80965ae7e" (UID: "494037ef-4514-4b97-b93f-7df80965ae7e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:40:33.708869 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.708738 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/494037ef-4514-4b97-b93f-7df80965ae7e-isvc-primary-a07d9f-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-a07d9f-kube-rbac-proxy-sar-config") pod "494037ef-4514-4b97-b93f-7df80965ae7e" (UID: "494037ef-4514-4b97-b93f-7df80965ae7e"). InnerVolumeSpecName "isvc-primary-a07d9f-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:40:33.709269 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.709251 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "494037ef-4514-4b97-b93f-7df80965ae7e" (UID: "494037ef-4514-4b97-b93f-7df80965ae7e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:40:33.804712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.804672 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d85h6\" (UniqueName: \"kubernetes.io/projected/494037ef-4514-4b97-b93f-7df80965ae7e-kube-api-access-d85h6\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:33.804712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.804706 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/494037ef-4514-4b97-b93f-7df80965ae7e-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:33.804712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.804716 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/494037ef-4514-4b97-b93f-7df80965ae7e-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" 
DevicePath \"\"" Apr 24 19:40:33.804968 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:33.804726 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-a07d9f-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/494037ef-4514-4b97-b93f-7df80965ae7e-isvc-primary-a07d9f-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:40:34.019905 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.019817 2564 generic.go:358] "Generic (PLEG): container finished" podID="494037ef-4514-4b97-b93f-7df80965ae7e" containerID="9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf" exitCode=0 Apr 24 19:40:34.020028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.019894 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerDied","Data":"9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf"} Apr 24 19:40:34.020028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.019938 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" event={"ID":"494037ef-4514-4b97-b93f-7df80965ae7e","Type":"ContainerDied","Data":"02d048df1af5e8cbfe1287da1879c50ee3f0d0cb52387eb48c40c402b3f42e27"} Apr 24 19:40:34.020028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.019938 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq" Apr 24 19:40:34.020028 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.020004 2564 scope.go:117] "RemoveContainer" containerID="4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533" Apr 24 19:40:34.028173 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.028155 2564 scope.go:117] "RemoveContainer" containerID="9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf" Apr 24 19:40:34.035161 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.035144 2564 scope.go:117] "RemoveContainer" containerID="36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0" Apr 24 19:40:34.041599 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.041575 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq"] Apr 24 19:40:34.042618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.042605 2564 scope.go:117] "RemoveContainer" containerID="4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533" Apr 24 19:40:34.042884 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:34.042867 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533\": container with ID starting with 4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533 not found: ID does not exist" containerID="4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533" Apr 24 19:40:34.042970 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.042893 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533"} err="failed to get container status \"4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533\": rpc error: code = NotFound desc = could not find container 
\"4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533\": container with ID starting with 4a155f159eb5b24f55c1e6e30b8256e0b997e3cde1eb859fea453bee09b53533 not found: ID does not exist" Apr 24 19:40:34.042970 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.042911 2564 scope.go:117] "RemoveContainer" containerID="9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf" Apr 24 19:40:34.043179 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:34.043160 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf\": container with ID starting with 9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf not found: ID does not exist" containerID="9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf" Apr 24 19:40:34.043223 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.043187 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf"} err="failed to get container status \"9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf\": rpc error: code = NotFound desc = could not find container \"9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf\": container with ID starting with 9340e86565767aab203032b8d131e5e5d0f076f694ad6d807ead7749b85c34cf not found: ID does not exist" Apr 24 19:40:34.043223 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.043207 2564 scope.go:117] "RemoveContainer" containerID="36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0" Apr 24 19:40:34.043443 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:34.043427 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0\": container with ID starting with 
36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0 not found: ID does not exist" containerID="36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0" Apr 24 19:40:34.043480 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.043450 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0"} err="failed to get container status \"36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0\": rpc error: code = NotFound desc = could not find container \"36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0\": container with ID starting with 36c77322d795633adc348dce107e95bbc4a737da90368fe22813ac2f25692ed0 not found: ID does not exist" Apr 24 19:40:34.045435 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:34.045414 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a07d9f-predictor-86fbdc74c-w4xtq"] Apr 24 19:40:35.024803 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:35.024775 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/storage-initializer/0.log" Apr 24 19:40:35.025229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:35.024811 2564 generic.go:358] "Generic (PLEG): container finished" podID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerID="d1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f" exitCode=1 Apr 24 19:40:35.025229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:35.024883 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" event={"ID":"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd","Type":"ContainerDied","Data":"d1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f"} Apr 24 19:40:35.186347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:35.186313 2564 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" path="/var/lib/kubelet/pods/494037ef-4514-4b97-b93f-7df80965ae7e/volumes" Apr 24 19:40:36.030432 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:36.030399 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/storage-initializer/0.log" Apr 24 19:40:36.030869 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:36.030518 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" event={"ID":"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd","Type":"ContainerStarted","Data":"8a9724784c058e11229c8c8d8aa54a45fc8fe5c1297561030a4923af0ac7c40b"} Apr 24 19:40:39.042315 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.042287 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/storage-initializer/1.log" Apr 24 19:40:39.042729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.042603 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/storage-initializer/0.log" Apr 24 19:40:39.042729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.042633 2564 generic.go:358] "Generic (PLEG): container finished" podID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerID="8a9724784c058e11229c8c8d8aa54a45fc8fe5c1297561030a4923af0ac7c40b" exitCode=1 Apr 24 19:40:39.042729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.042713 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" 
event={"ID":"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd","Type":"ContainerDied","Data":"8a9724784c058e11229c8c8d8aa54a45fc8fe5c1297561030a4923af0ac7c40b"}
Apr 24 19:40:39.042827 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.042762 2564 scope.go:117] "RemoveContainer" containerID="d1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f"
Apr 24 19:40:39.043145 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.043127 2564 scope.go:117] "RemoveContainer" containerID="d1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f"
Apr 24 19:40:39.052887 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:39.052856 2564 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_kserve-ci-e2e-test_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd_0 in pod sandbox 44625dc5cde11176972e5a98c70b809bdfd61c94627fa2717e18ecaa89633d75 from index: no such id: 'd1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f'" containerID="d1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f"
Apr 24 19:40:39.052964 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:39.052905 2564 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_kserve-ci-e2e-test_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd_0 in pod sandbox 44625dc5cde11176972e5a98c70b809bdfd61c94627fa2717e18ecaa89633d75 from index: no such id: 'd1f59ae215a3933dee77605b3199ead5c29008177e8c57ed86dbd8149005381f'; Skipping pod \"isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_kserve-ci-e2e-test(9756a3f5-8098-40e4-b4ee-3e1a878ed1bd)\"" logger="UnhandledError"
Apr 24 19:40:39.054265 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:39.054243 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_kserve-ci-e2e-test(9756a3f5-8098-40e4-b4ee-3e1a878ed1bd)\"" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd"
Apr 24 19:40:39.271385 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.271346 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"]
Apr 24 19:40:39.364826 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.364726 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"]
Apr 24 19:40:39.365103 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365085 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerName="storage-initializer"
Apr 24 19:40:39.365180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365106 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerName="storage-initializer"
Apr 24 19:40:39.365180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365120 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerName="storage-initializer"
Apr 24 19:40:39.365180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365128 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerName="storage-initializer"
Apr 24 19:40:39.365180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365146 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kube-rbac-proxy"
Apr 24 19:40:39.365180 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365155 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kube-rbac-proxy"
Apr 24 19:40:39.365434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365183 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container"
Apr 24 19:40:39.365434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365191 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container"
Apr 24 19:40:39.365434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365208 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="storage-initializer"
Apr 24 19:40:39.365434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365216 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="storage-initializer"
Apr 24 19:40:39.365434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365285 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerName="storage-initializer"
Apr 24 19:40:39.365434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365300 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kserve-container"
Apr 24 19:40:39.365434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365314 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="494037ef-4514-4b97-b93f-7df80965ae7e" containerName="kube-rbac-proxy"
Apr 24 19:40:39.365807 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.365449 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ccd68f3-ff9a-4ee4-90f5-45614b48dc0d" containerName="storage-initializer"
Apr 24 19:40:39.369747 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.369724 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.372071 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.372050 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\""
Apr 24 19:40:39.372195 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.372075 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\""
Apr 24 19:40:39.372195 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.372085 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-27j79\""
Apr 24 19:40:39.378846 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.378822 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"]
Apr 24 19:40:39.550839 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.550798 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.551020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.550852 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.551020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.550926 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8nx6\" (UniqueName: \"kubernetes.io/projected/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kube-api-access-m8nx6\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.551020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.550953 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.652297 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.652209 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.652297 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.652253 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.652297 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.652295 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m8nx6\" (UniqueName: \"kubernetes.io/projected/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kube-api-access-m8nx6\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.652611 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.652325 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.652611 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:39.652455 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-serving-cert: secret "isvc-predictive-sklearn-predictor-serving-cert" not found
Apr 24 19:40:39.652611 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:40:39.652547 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls podName:6118c9e1-90e6-4c7a-8dfd-b9b57b38657f nodeName:}" failed. No retries permitted until 2026-04-24 19:40:40.152529832 +0000 UTC m=+2033.476773994 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls") pod "isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" (UID: "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f") : secret "isvc-predictive-sklearn-predictor-serving-cert" not found
Apr 24 19:40:39.652806 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.652758 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.653018 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.652998 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:39.661328 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:39.661306 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8nx6\" (UniqueName: \"kubernetes.io/projected/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kube-api-access-m8nx6\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:40.046567 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.046525 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/storage-initializer/1.log"
Apr 24 19:40:40.157908 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.157867 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:40.160462 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.160431 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:40.177847 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.177823 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/storage-initializer/1.log"
Apr 24 19:40:40.177964 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.177891 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"
Apr 24 19:40:40.280644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.280608 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:40:40.359004 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.358973 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wfmc5\" (UniqueName: \"kubernetes.io/projected/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kube-api-access-wfmc5\") pod \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") "
Apr 24 19:40:40.359166 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359063 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kserve-provision-location\") pod \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") "
Apr 24 19:40:40.359166 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359102 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-cabundle-cert\") pod \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") "
Apr 24 19:40:40.359166 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359136 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-proxy-tls\") pod \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") "
Apr 24 19:40:40.359345 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359167 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\") pod \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\" (UID: \"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd\") "
Apr 24 19:40:40.359406 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359342 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" (UID: "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:40:40.359486 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359466 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:40:40.359739 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359712 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-isvc-init-fail-6c0429-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-6c0429-kube-rbac-proxy-sar-config") pod "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" (UID: "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd"). InnerVolumeSpecName "isvc-init-fail-6c0429-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:40:40.359894 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.359876 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" (UID: "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:40:40.363001 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.362976 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kube-api-access-wfmc5" (OuterVolumeSpecName: "kube-api-access-wfmc5") pod "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" (UID: "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd"). InnerVolumeSpecName "kube-api-access-wfmc5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:40:40.363122 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.363071 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" (UID: "9756a3f5-8098-40e4-b4ee-3e1a878ed1bd"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:40:40.401594 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.401543 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"]
Apr 24 19:40:40.460132 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.460110 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-isvc-init-fail-6c0429-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:40:40.460203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.460134 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wfmc5\" (UniqueName: \"kubernetes.io/projected/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-kube-api-access-wfmc5\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:40:40.460203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.460146 2564 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-cabundle-cert\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:40:40.460203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:40.460156 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:40:41.050871 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.050846 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k_9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/storage-initializer/1.log"
Apr 24 19:40:41.051281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.050933 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k" event={"ID":"9756a3f5-8098-40e4-b4ee-3e1a878ed1bd","Type":"ContainerDied","Data":"44625dc5cde11176972e5a98c70b809bdfd61c94627fa2717e18ecaa89633d75"}
Apr 24 19:40:41.051281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.050972 2564 scope.go:117] "RemoveContainer" containerID="8a9724784c058e11229c8c8d8aa54a45fc8fe5c1297561030a4923af0ac7c40b"
Apr 24 19:40:41.051281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.050983 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"
Apr 24 19:40:41.052617 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.052584 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerStarted","Data":"d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc"}
Apr 24 19:40:41.052730 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.052622 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerStarted","Data":"60cf3dd81877084fc93d3e4a69d19cd31b6e1e3864f5d98ca0b267a367033a89"}
Apr 24 19:40:41.100168 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.100134 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"]
Apr 24 19:40:41.104089 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.104058 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-6c0429-predictor-7fd6df44b6-nsl4k"]
Apr 24 19:40:41.183997 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:41.183965 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" path="/var/lib/kubelet/pods/9756a3f5-8098-40e4-b4ee-3e1a878ed1bd/volumes"
Apr 24 19:40:45.066729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:45.066691 2564 generic.go:358] "Generic (PLEG): container finished" podID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerID="d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc" exitCode=0
Apr 24 19:40:45.067114 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:40:45.066768 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerDied","Data":"d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc"}
Apr 24 19:41:03.121534 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:03.121499 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerStarted","Data":"6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4"}
Apr 24 19:41:03.122005 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:03.121543 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerStarted","Data":"2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b"}
Apr 24 19:41:03.122005 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:03.121768 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:41:03.140943 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:03.140887 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podStartSLOduration=6.522642709 podStartE2EDuration="24.140872076s" podCreationTimestamp="2026-04-24 19:40:39 +0000 UTC" firstStartedPulling="2026-04-24 19:40:45.067985359 +0000 UTC m=+2038.392229500" lastFinishedPulling="2026-04-24 19:41:02.686214723 +0000 UTC m=+2056.010458867" observedRunningTime="2026-04-24 19:41:03.139945748 +0000 UTC m=+2056.464189912" watchObservedRunningTime="2026-04-24 19:41:03.140872076 +0000 UTC m=+2056.465116293"
Apr 24 19:41:04.124419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:04.124389 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:41:04.125421 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:04.125397 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:41:05.126892 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:05.126843 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:41:10.131010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:10.130933 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:41:10.131581 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:10.131528 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:41:20.132281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:20.132237 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:41:30.132173 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:30.132126 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:41:40.131692 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:40.131636 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:41:50.131535 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:41:50.131490 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:42:00.131732 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:00.131688 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:42:10.131904 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:10.131861 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused"
Apr 24 19:42:20.132413 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:20.132385 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"
Apr 24 19:42:29.506739 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.506704 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"]
Apr 24 19:42:29.507213 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.507029 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" containerID="cri-o://2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b" gracePeriod=30
Apr 24 19:42:29.507213 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.507071 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kube-rbac-proxy" containerID="cri-o://6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4" gracePeriod=30
Apr 24 19:42:29.611294 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.611258 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"]
Apr 24 19:42:29.611625 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.611604 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerName="storage-initializer"
Apr 24 19:42:29.611625 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.611624 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerName="storage-initializer"
Apr 24 19:42:29.611747 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.611686 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerName="storage-initializer"
Apr 24 19:42:29.611747 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.611699 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerName="storage-initializer"
Apr 24 19:42:29.611747 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.611747 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerName="storage-initializer"
Apr 24 19:42:29.611852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.611753 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9756a3f5-8098-40e4-b4ee-3e1a878ed1bd" containerName="storage-initializer"
Apr 24 19:42:29.614887 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.614863 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.617149 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.617128 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\""
Apr 24 19:42:29.617244 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.617156 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\""
Apr 24 19:42:29.627395 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.627368 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"]
Apr 24 19:42:29.781437 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.781337 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cckh5\" (UniqueName: \"kubernetes.io/projected/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kube-api-access-cckh5\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.781437 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.781395 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e18ec68e-71a0-4f0c-8252-64094ac5a19d-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.781437 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.781417 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e18ec68e-71a0-4f0c-8252-64094ac5a19d-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.781707 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.781484 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.882171 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.882123 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cckh5\" (UniqueName: \"kubernetes.io/projected/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kube-api-access-cckh5\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.882387 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.882182 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e18ec68e-71a0-4f0c-8252-64094ac5a19d-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.882387 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.882205 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e18ec68e-71a0-4f0c-8252-64094ac5a19d-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.882387 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.882242 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.882712 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.882687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.882968 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.882945 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e18ec68e-71a0-4f0c-8252-64094ac5a19d-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.884723 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.884691 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e18ec68e-71a0-4f0c-8252-64094ac5a19d-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.890811 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.890783 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cckh5\" (UniqueName: \"kubernetes.io/projected/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kube-api-access-cckh5\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-grt56\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"
Apr 24 19:42:29.924611 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:29.924570 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" Apr 24 19:42:30.054324 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.054290 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"] Apr 24 19:42:30.056935 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:42:30.056900 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode18ec68e_71a0_4f0c_8252_64094ac5a19d.slice/crio-19f2095f6f29ff6314a032ce2832971acd6e1746a7eb70704ebe5d001551f8c1 WatchSource:0}: Error finding container 19f2095f6f29ff6314a032ce2832971acd6e1746a7eb70704ebe5d001551f8c1: Status 404 returned error can't find the container with id 19f2095f6f29ff6314a032ce2832971acd6e1746a7eb70704ebe5d001551f8c1 Apr 24 19:42:30.058743 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.058725 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 19:42:30.127427 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.127385 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 24 19:42:30.131727 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.131695 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 24 19:42:30.365319 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.365221 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerStarted","Data":"57651b25ec712a5d599d88519edecfba5b602fca3e25ed247a0906c2450e4aaa"} Apr 24 19:42:30.365319 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.365265 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerStarted","Data":"19f2095f6f29ff6314a032ce2832971acd6e1746a7eb70704ebe5d001551f8c1"} Apr 24 19:42:30.367152 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.367127 2564 generic.go:358] "Generic (PLEG): container finished" podID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerID="6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4" exitCode=2 Apr 24 19:42:30.367268 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:30.367207 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerDied","Data":"6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4"} Apr 24 19:42:34.379301 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.379265 2564 generic.go:358] "Generic (PLEG): container finished" podID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerID="57651b25ec712a5d599d88519edecfba5b602fca3e25ed247a0906c2450e4aaa" exitCode=0 Apr 24 19:42:34.379301 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.379305 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerDied","Data":"57651b25ec712a5d599d88519edecfba5b602fca3e25ed247a0906c2450e4aaa"} Apr 24 19:42:34.651814 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.651791 2564 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" Apr 24 19:42:34.823987 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.823946 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls\") pod \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " Apr 24 19:42:34.824151 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.824014 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kserve-provision-location\") pod \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " Apr 24 19:42:34.824151 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.824084 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8nx6\" (UniqueName: \"kubernetes.io/projected/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kube-api-access-m8nx6\") pod \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " Apr 24 19:42:34.824151 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.824115 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\" (UID: \"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f\") " Apr 24 19:42:34.828667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.824275 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" (UID: "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:42:34.828667 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.824821 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" (UID: "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:42:34.830692 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.830660 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kube-api-access-m8nx6" (OuterVolumeSpecName: "kube-api-access-m8nx6") pod "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" (UID: "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f"). InnerVolumeSpecName "kube-api-access-m8nx6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:42:34.830808 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.830657 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" (UID: "6118c9e1-90e6-4c7a-8dfd-b9b57b38657f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:42:34.924824 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.924747 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:42:34.924824 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.924782 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:42:34.924824 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.924794 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m8nx6\" (UniqueName: \"kubernetes.io/projected/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-kube-api-access-m8nx6\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:42:34.924824 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:34.924804 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:42:35.384206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.384168 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerStarted","Data":"182ea73ac431033be791ca3c75da36afdad1b2577da714ce700b41bb2f9a10b2"} Apr 24 19:42:35.384656 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.384214 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" 
event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerStarted","Data":"9539cbc39bbd3b5126d55b95ea38d0e90b9279a1a0ee601300222ed4b578fc1e"} Apr 24 19:42:35.384656 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.384571 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" Apr 24 19:42:35.384782 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.384675 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" Apr 24 19:42:35.387512 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.387481 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:42:35.388499 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.388472 2564 generic.go:358] "Generic (PLEG): container finished" podID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerID="2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b" exitCode=0 Apr 24 19:42:35.388636 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.388570 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerDied","Data":"2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b"} Apr 24 19:42:35.388636 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.388611 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" 
event={"ID":"6118c9e1-90e6-4c7a-8dfd-b9b57b38657f","Type":"ContainerDied","Data":"60cf3dd81877084fc93d3e4a69d19cd31b6e1e3864f5d98ca0b267a367033a89"} Apr 24 19:42:35.388636 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.388613 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5" Apr 24 19:42:35.388791 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.388624 2564 scope.go:117] "RemoveContainer" containerID="6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4" Apr 24 19:42:35.396065 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.396042 2564 scope.go:117] "RemoveContainer" containerID="2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b" Apr 24 19:42:35.403139 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.403119 2564 scope.go:117] "RemoveContainer" containerID="d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc" Apr 24 19:42:35.410329 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.410310 2564 scope.go:117] "RemoveContainer" containerID="6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4" Apr 24 19:42:35.410607 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:42:35.410589 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4\": container with ID starting with 6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4 not found: ID does not exist" containerID="6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4" Apr 24 19:42:35.410661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.410614 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4"} err="failed to get container status 
\"6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4\": rpc error: code = NotFound desc = could not find container \"6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4\": container with ID starting with 6882cb4e850e433ed6af9339a0ce8777467aa592d534b695a99d5bef7c1707e4 not found: ID does not exist" Apr 24 19:42:35.410661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.410632 2564 scope.go:117] "RemoveContainer" containerID="2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b" Apr 24 19:42:35.410877 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:42:35.410859 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b\": container with ID starting with 2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b not found: ID does not exist" containerID="2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b" Apr 24 19:42:35.410925 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.410882 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b"} err="failed to get container status \"2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b\": rpc error: code = NotFound desc = could not find container \"2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b\": container with ID starting with 2469d83f150f77cff20b2d1e22aaa9b9af5def6dbee030f290bc0b7969f8c57b not found: ID does not exist" Apr 24 19:42:35.410925 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.410898 2564 scope.go:117] "RemoveContainer" containerID="d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc" Apr 24 19:42:35.411084 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:42:35.411067 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc\": container with ID starting with d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc not found: ID does not exist" containerID="d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc" Apr 24 19:42:35.411125 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.411091 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc"} err="failed to get container status \"d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc\": rpc error: code = NotFound desc = could not find container \"d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc\": container with ID starting with d40de45dde1142f86646a1f8dab75604fe8967289e29db299309f4b8b98bb4cc not found: ID does not exist" Apr 24 19:42:35.423198 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.423157 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podStartSLOduration=6.423142692 podStartE2EDuration="6.423142692s" podCreationTimestamp="2026-04-24 19:42:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:42:35.408382209 +0000 UTC m=+2148.732626372" watchObservedRunningTime="2026-04-24 19:42:35.423142692 +0000 UTC m=+2148.747386859" Apr 24 19:42:35.424206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.424183 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"] Apr 24 19:42:35.433335 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:35.433314 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-9w5s5"] Apr 24 19:42:36.392917 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:42:36.392835 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:42:37.184918 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:37.184879 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" path="/var/lib/kubelet/pods/6118c9e1-90e6-4c7a-8dfd-b9b57b38657f/volumes" Apr 24 19:42:41.397833 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:41.397804 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" Apr 24 19:42:41.398461 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:41.398433 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:42:51.398596 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:42:51.398531 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:43:01.398349 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:01.398309 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 
19:43:11.399018 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:11.398976 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:43:21.398401 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:21.398357 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:43:31.398342 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:31.398300 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:43:41.398756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:41.398708 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:43:51.399689 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:51.399653 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" Apr 24 19:43:59.715327 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.715292 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"] Apr 24 19:43:59.715761 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:43:59.715668 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" containerID="cri-o://9539cbc39bbd3b5126d55b95ea38d0e90b9279a1a0ee601300222ed4b578fc1e" gracePeriod=30 Apr 24 19:43:59.715832 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.715719 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kube-rbac-proxy" containerID="cri-o://182ea73ac431033be791ca3c75da36afdad1b2577da714ce700b41bb2f9a10b2" gracePeriod=30 Apr 24 19:43:59.825955 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.825908 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh"] Apr 24 19:43:59.826347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826253 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kube-rbac-proxy" Apr 24 19:43:59.826347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826266 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kube-rbac-proxy" Apr 24 19:43:59.826347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826285 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="storage-initializer" Apr 24 19:43:59.826347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826291 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="storage-initializer" Apr 24 19:43:59.826347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826297 2564 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" Apr 24 19:43:59.826347 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826303 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" Apr 24 19:43:59.826623 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826357 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kube-rbac-proxy" Apr 24 19:43:59.826623 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.826368 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6118c9e1-90e6-4c7a-8dfd-b9b57b38657f" containerName="kserve-container" Apr 24 19:43:59.833603 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.833569 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:43:59.836141 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.836113 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\"" Apr 24 19:43:59.836337 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.836314 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\"" Apr 24 19:43:59.839790 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.839765 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh"] Apr 24 19:43:59.973277 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.973175 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:43:59.973277 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.973237 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:43:59.973481 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.973307 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:43:59.973481 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:43:59.973335 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kube-api-access-7lq2r\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.073827 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.073776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.074078 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.073849 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.074078 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.073892 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.074078 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.073931 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kube-api-access-7lq2r\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.074293 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.074276 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kserve-provision-location\") 
pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.074526 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.074506 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.076300 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.076284 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.082010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.081984 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kube-api-access-7lq2r\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.145378 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.145335 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:00.272399 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.272266 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh"] Apr 24 19:44:00.275155 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:44:00.275126 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fcaafbc_57fa_430a_b0f7_1893ae7185ba.slice/crio-4f79598c215dcff78675e9bc2079538eac0f71c83d38e42ec50a70d794ec5d77 WatchSource:0}: Error finding container 4f79598c215dcff78675e9bc2079538eac0f71c83d38e42ec50a70d794ec5d77: Status 404 returned error can't find the container with id 4f79598c215dcff78675e9bc2079538eac0f71c83d38e42ec50a70d794ec5d77 Apr 24 19:44:00.627678 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.627590 2564 generic.go:358] "Generic (PLEG): container finished" podID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerID="182ea73ac431033be791ca3c75da36afdad1b2577da714ce700b41bb2f9a10b2" exitCode=2 Apr 24 19:44:00.627678 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.627662 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerDied","Data":"182ea73ac431033be791ca3c75da36afdad1b2577da714ce700b41bb2f9a10b2"} Apr 24 19:44:00.628908 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.628882 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerStarted","Data":"00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00"} Apr 24 19:44:00.629056 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:00.628912 2564 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerStarted","Data":"4f79598c215dcff78675e9bc2079538eac0f71c83d38e42ec50a70d794ec5d77"} Apr 24 19:44:01.393616 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:01.393575 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused" Apr 24 19:44:01.398962 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:01.398932 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 24 19:44:04.643098 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.643061 2564 generic.go:358] "Generic (PLEG): container finished" podID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerID="9539cbc39bbd3b5126d55b95ea38d0e90b9279a1a0ee601300222ed4b578fc1e" exitCode=0 Apr 24 19:44:04.643547 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.643099 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerDied","Data":"9539cbc39bbd3b5126d55b95ea38d0e90b9279a1a0ee601300222ed4b578fc1e"} Apr 24 19:44:04.644449 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.644428 2564 generic.go:358] "Generic (PLEG): container finished" podID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerID="00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00" exitCode=0 Apr 24 19:44:04.644596 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:44:04.644477 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerDied","Data":"00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00"} Apr 24 19:44:04.682445 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.682420 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" Apr 24 19:44:04.813253 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.813160 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e18ec68e-71a0-4f0c-8252-64094ac5a19d-proxy-tls\") pod \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " Apr 24 19:44:04.813253 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.813219 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cckh5\" (UniqueName: \"kubernetes.io/projected/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kube-api-access-cckh5\") pod \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " Apr 24 19:44:04.813253 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.813248 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e18ec68e-71a0-4f0c-8252-64094ac5a19d-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " Apr 24 19:44:04.813532 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.813285 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kserve-provision-location\") pod \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\" (UID: \"e18ec68e-71a0-4f0c-8252-64094ac5a19d\") " Apr 24 19:44:04.813688 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.813658 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e18ec68e-71a0-4f0c-8252-64094ac5a19d-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "e18ec68e-71a0-4f0c-8252-64094ac5a19d" (UID: "e18ec68e-71a0-4f0c-8252-64094ac5a19d"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:44:04.813784 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.813701 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e18ec68e-71a0-4f0c-8252-64094ac5a19d" (UID: "e18ec68e-71a0-4f0c-8252-64094ac5a19d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:44:04.815340 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.815321 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e18ec68e-71a0-4f0c-8252-64094ac5a19d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e18ec68e-71a0-4f0c-8252-64094ac5a19d" (UID: "e18ec68e-71a0-4f0c-8252-64094ac5a19d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:44:04.815406 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.815349 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kube-api-access-cckh5" (OuterVolumeSpecName: "kube-api-access-cckh5") pod "e18ec68e-71a0-4f0c-8252-64094ac5a19d" (UID: "e18ec68e-71a0-4f0c-8252-64094ac5a19d"). InnerVolumeSpecName "kube-api-access-cckh5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:44:04.913946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.913909 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cckh5\" (UniqueName: \"kubernetes.io/projected/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kube-api-access-cckh5\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:44:04.913946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.913945 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e18ec68e-71a0-4f0c-8252-64094ac5a19d-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:44:04.913946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.913957 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e18ec68e-71a0-4f0c-8252-64094ac5a19d-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:44:04.914204 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:04.913967 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e18ec68e-71a0-4f0c-8252-64094ac5a19d-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:44:05.648912 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.648875 2564 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" event={"ID":"e18ec68e-71a0-4f0c-8252-64094ac5a19d","Type":"ContainerDied","Data":"19f2095f6f29ff6314a032ce2832971acd6e1746a7eb70704ebe5d001551f8c1"} Apr 24 19:44:05.648912 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.648920 2564 scope.go:117] "RemoveContainer" containerID="182ea73ac431033be791ca3c75da36afdad1b2577da714ce700b41bb2f9a10b2" Apr 24 19:44:05.649429 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.648953 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56" Apr 24 19:44:05.650940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.650908 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerStarted","Data":"9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053"} Apr 24 19:44:05.651059 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.650956 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerStarted","Data":"06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9"} Apr 24 19:44:05.651306 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.651288 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:05.651390 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.651317 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:05.652998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.652966 2564 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:44:05.659305 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.657394 2564 scope.go:117] "RemoveContainer" containerID="9539cbc39bbd3b5126d55b95ea38d0e90b9279a1a0ee601300222ed4b578fc1e" Apr 24 19:44:05.666199 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.666174 2564 scope.go:117] "RemoveContainer" containerID="57651b25ec712a5d599d88519edecfba5b602fca3e25ed247a0906c2450e4aaa" Apr 24 19:44:05.666281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.666198 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"] Apr 24 19:44:05.671598 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.671528 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-grt56"] Apr 24 19:44:05.689372 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:05.689317 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podStartSLOduration=6.689297095 podStartE2EDuration="6.689297095s" podCreationTimestamp="2026-04-24 19:43:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:44:05.687188733 +0000 UTC m=+2239.011432896" watchObservedRunningTime="2026-04-24 19:44:05.689297095 +0000 UTC m=+2239.013541258" Apr 24 19:44:06.655326 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:06.655249 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:44:07.184146 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:07.184113 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" path="/var/lib/kubelet/pods/e18ec68e-71a0-4f0c-8252-64094ac5a19d/volumes" Apr 24 19:44:11.661499 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:11.661469 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:44:11.662089 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:11.662060 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:44:21.662463 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:21.662420 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:44:31.662332 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:31.662283 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:44:41.662424 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:41.662380 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" 
podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:44:51.662729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:44:51.662680 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:45:01.662011 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:01.661972 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:45:11.662371 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:11.662328 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:45:21.663431 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:21.663400 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:45:29.978217 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:29.978178 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh"] Apr 24 19:45:29.978718 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:29.978625 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" 
podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" containerID="cri-o://06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9" gracePeriod=30 Apr 24 19:45:29.978796 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:29.978700 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kube-rbac-proxy" containerID="cri-o://9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053" gracePeriod=30 Apr 24 19:45:30.086750 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.086717 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh"] Apr 24 19:45:30.087000 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.086987 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="storage-initializer" Apr 24 19:45:30.087000 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.087000 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="storage-initializer" Apr 24 19:45:30.087126 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.087009 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kube-rbac-proxy" Apr 24 19:45:30.087126 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.087015 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kube-rbac-proxy" Apr 24 19:45:30.087126 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.087030 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" Apr 24 19:45:30.087126 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:45:30.087036 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" Apr 24 19:45:30.087126 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.087092 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kserve-container" Apr 24 19:45:30.087126 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.087100 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e18ec68e-71a0-4f0c-8252-64094ac5a19d" containerName="kube-rbac-proxy" Apr 24 19:45:30.090345 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.090310 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.092837 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.092814 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 24 19:45:30.093172 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.093150 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 24 19:45:30.100199 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.100172 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh"] Apr 24 19:45:30.242896 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.242780 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.242896 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.242843 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.243121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.242930 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.243121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.243077 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzvxx\" (UniqueName: \"kubernetes.io/projected/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kube-api-access-dzvxx\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.343991 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.343952 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzvxx\" (UniqueName: \"kubernetes.io/projected/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kube-api-access-dzvxx\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" 
(UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.343991 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.343993 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.344250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.344022 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.344250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.344052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.344250 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:45:30.344152 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-serving-cert: secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 24 19:45:30.344250 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:45:30.344232 2564 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls podName:1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c nodeName:}" failed. No retries permitted until 2026-04-24 19:45:30.844200465 +0000 UTC m=+2324.168444607 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls") pod "isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" (UID: "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c") : secret "isvc-predictive-sklearn-v2-predictor-serving-cert" not found Apr 24 19:45:30.344483 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.344466 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.344750 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.344731 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.354856 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.354831 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzvxx\" (UniqueName: \"kubernetes.io/projected/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kube-api-access-dzvxx\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.848397 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.848358 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.850839 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.850804 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:30.887541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.887508 2564 generic.go:358] "Generic (PLEG): container finished" podID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerID="9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053" exitCode=2 Apr 24 19:45:30.887737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:30.887574 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerDied","Data":"9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053"} Apr 24 19:45:31.001904 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:31.001850 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:31.122845 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:31.122766 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh"] Apr 24 19:45:31.125931 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:45:31.125900 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1fc2d6f9_66d7_4ea9_81ab_1fffbb55a09c.slice/crio-4d88b60b29e146bcf2de36e5ddf1e361ba593e19e2106ea2686aac07ad31b921 WatchSource:0}: Error finding container 4d88b60b29e146bcf2de36e5ddf1e361ba593e19e2106ea2686aac07ad31b921: Status 404 returned error can't find the container with id 4d88b60b29e146bcf2de36e5ddf1e361ba593e19e2106ea2686aac07ad31b921 Apr 24 19:45:31.656170 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:31.656123 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 24 19:45:31.662643 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:31.662609 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 24 19:45:31.891907 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:31.891862 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" 
event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerStarted","Data":"415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1"} Apr 24 19:45:31.891907 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:31.891911 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerStarted","Data":"4d88b60b29e146bcf2de36e5ddf1e361ba593e19e2106ea2686aac07ad31b921"} Apr 24 19:45:35.417522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.417497 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:45:35.491128 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.491090 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-proxy-tls\") pod \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " Apr 24 19:45:35.491305 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.491167 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " Apr 24 19:45:35.491305 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.491198 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kserve-provision-location\") pod \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " Apr 24 19:45:35.491305 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:45:35.491281 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kube-api-access-7lq2r\") pod \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\" (UID: \"2fcaafbc-57fa-430a-b0f7-1893ae7185ba\") " Apr 24 19:45:35.491580 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.491532 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "2fcaafbc-57fa-430a-b0f7-1893ae7185ba" (UID: "2fcaafbc-57fa-430a-b0f7-1893ae7185ba"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:45:35.491686 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.491580 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2fcaafbc-57fa-430a-b0f7-1893ae7185ba" (UID: "2fcaafbc-57fa-430a-b0f7-1893ae7185ba"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:45:35.493198 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.493180 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2fcaafbc-57fa-430a-b0f7-1893ae7185ba" (UID: "2fcaafbc-57fa-430a-b0f7-1893ae7185ba"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:45:35.493338 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.493317 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kube-api-access-7lq2r" (OuterVolumeSpecName: "kube-api-access-7lq2r") pod "2fcaafbc-57fa-430a-b0f7-1893ae7185ba" (UID: "2fcaafbc-57fa-430a-b0f7-1893ae7185ba"). InnerVolumeSpecName "kube-api-access-7lq2r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:45:35.592204 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.592105 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7lq2r\" (UniqueName: \"kubernetes.io/projected/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kube-api-access-7lq2r\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:45:35.592204 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.592139 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:45:35.592204 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.592154 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:45:35.592204 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.592168 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fcaafbc-57fa-430a-b0f7-1893ae7185ba-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:45:35.907574 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.907474 2564 generic.go:358] "Generic (PLEG): 
container finished" podID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerID="06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9" exitCode=0 Apr 24 19:45:35.907742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.907573 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" Apr 24 19:45:35.907742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.907565 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerDied","Data":"06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9"} Apr 24 19:45:35.907742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.907684 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh" event={"ID":"2fcaafbc-57fa-430a-b0f7-1893ae7185ba","Type":"ContainerDied","Data":"4f79598c215dcff78675e9bc2079538eac0f71c83d38e42ec50a70d794ec5d77"} Apr 24 19:45:35.907742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.907723 2564 scope.go:117] "RemoveContainer" containerID="9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053" Apr 24 19:45:35.909168 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.909133 2564 generic.go:358] "Generic (PLEG): container finished" podID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerID="415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1" exitCode=0 Apr 24 19:45:35.909314 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.909190 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerDied","Data":"415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1"} Apr 24 19:45:35.916713 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:45:35.916633 2564 scope.go:117] "RemoveContainer" containerID="06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9" Apr 24 19:45:35.924121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.924071 2564 scope.go:117] "RemoveContainer" containerID="00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00" Apr 24 19:45:35.937318 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.937298 2564 scope.go:117] "RemoveContainer" containerID="9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053" Apr 24 19:45:35.937665 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:45:35.937643 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053\": container with ID starting with 9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053 not found: ID does not exist" containerID="9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053" Apr 24 19:45:35.937731 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.937674 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053"} err="failed to get container status \"9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053\": rpc error: code = NotFound desc = could not find container \"9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053\": container with ID starting with 9f47ad406d85310d926efcc8da7c37bf80b1f6bc2c7ea2852ce57ac31531d053 not found: ID does not exist" Apr 24 19:45:35.937731 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.937693 2564 scope.go:117] "RemoveContainer" containerID="06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9" Apr 24 19:45:35.937966 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:45:35.937951 2564 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9\": container with ID starting with 06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9 not found: ID does not exist" containerID="06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9" Apr 24 19:45:35.938015 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.937969 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9"} err="failed to get container status \"06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9\": rpc error: code = NotFound desc = could not find container \"06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9\": container with ID starting with 06fc44d96f52bb465520d0786e15ac31d1c2a44d25eca9acb528b59cd06dbdf9 not found: ID does not exist" Apr 24 19:45:35.938015 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.937984 2564 scope.go:117] "RemoveContainer" containerID="00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00" Apr 24 19:45:35.938238 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:45:35.938220 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00\": container with ID starting with 00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00 not found: ID does not exist" containerID="00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00" Apr 24 19:45:35.938278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.938250 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00"} err="failed to get container status \"00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00\": rpc 
error: code = NotFound desc = could not find container \"00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00\": container with ID starting with 00ecdf3620326b7ad33622126530e7e988235e218efa4fe96d7fac663f620b00 not found: ID does not exist" Apr 24 19:45:35.947721 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.947690 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh"] Apr 24 19:45:35.949362 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:35.949336 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-lcnhh"] Apr 24 19:45:36.918041 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:36.918001 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerStarted","Data":"eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36"} Apr 24 19:45:36.918041 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:36.918042 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerStarted","Data":"e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411"} Apr 24 19:45:36.918676 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:36.918385 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:36.937207 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:36.937154 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podStartSLOduration=6.937137677 podStartE2EDuration="6.937137677s" podCreationTimestamp="2026-04-24 19:45:30 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:45:36.935674915 +0000 UTC m=+2330.259919098" watchObservedRunningTime="2026-04-24 19:45:36.937137677 +0000 UTC m=+2330.261381842" Apr 24 19:45:37.184107 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:37.184028 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" path="/var/lib/kubelet/pods/2fcaafbc-57fa-430a-b0f7-1893ae7185ba/volumes" Apr 24 19:45:37.921632 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:37.921600 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:45:43.931121 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:45:43.931092 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:46:13.931930 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:46:13.931889 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:46:23.932610 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:46:23.932544 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:46:33.932318 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:46:33.932269 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:46:43.932618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:46:43.932574 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:46:45.180348 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:46:45.180296 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:46:55.184476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:46:55.184442 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:47:00.203432 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.203396 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh"] Apr 24 19:47:00.203913 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.203742 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" 
podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" containerID="cri-o://e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411" gracePeriod=30 Apr 24 19:47:00.203913 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.203789 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kube-rbac-proxy" containerID="cri-o://eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36" gracePeriod=30 Apr 24 19:47:00.314998 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.314958 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk"] Apr 24 19:47:00.315290 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.315275 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="storage-initializer" Apr 24 19:47:00.315330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.315291 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="storage-initializer" Apr 24 19:47:00.315330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.315303 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" Apr 24 19:47:00.315330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.315308 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" Apr 24 19:47:00.315330 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.315328 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kube-rbac-proxy" Apr 24 19:47:00.315503 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:47:00.315334 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kube-rbac-proxy" Apr 24 19:47:00.315503 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.315384 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kube-rbac-proxy" Apr 24 19:47:00.315503 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.315393 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fcaafbc-57fa-430a-b0f7-1893ae7185ba" containerName="kserve-container" Apr 24 19:47:00.318480 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.318461 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.320618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.320600 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 24 19:47:00.320713 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.320634 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 24 19:47:00.328918 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.328890 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk"] Apr 24 19:47:00.471123 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.471024 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb51790e-e145-4547-b4cf-603deb23e422-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.471123 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.471083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8pm8\" (UniqueName: \"kubernetes.io/projected/bb51790e-e145-4547-b4cf-603deb23e422-kube-api-access-b8pm8\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.471342 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.471179 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb51790e-e145-4547-b4cf-603deb23e422-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.471342 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.471301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb51790e-e145-4547-b4cf-603deb23e422-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.571769 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.571723 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb51790e-e145-4547-b4cf-603deb23e422-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.571977 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.571785 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb51790e-e145-4547-b4cf-603deb23e422-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.571977 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.571806 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8pm8\" (UniqueName: \"kubernetes.io/projected/bb51790e-e145-4547-b4cf-603deb23e422-kube-api-access-b8pm8\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.571977 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.571833 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb51790e-e145-4547-b4cf-603deb23e422-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.572215 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.572191 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb51790e-e145-4547-b4cf-603deb23e422-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.572473 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.572453 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb51790e-e145-4547-b4cf-603deb23e422-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.574282 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.574260 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb51790e-e145-4547-b4cf-603deb23e422-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.580116 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.580089 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8pm8\" (UniqueName: \"kubernetes.io/projected/bb51790e-e145-4547-b4cf-603deb23e422-kube-api-access-b8pm8\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.628325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.628282 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:00.749847 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:00.749821 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk"] Apr 24 19:47:00.752213 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:47:00.752190 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb51790e_e145_4547_b4cf_603deb23e422.slice/crio-1fe8e6d838622139f821c30d97e6ba2d621e67e2f4a535c63850e5fc1fe43783 WatchSource:0}: Error finding container 1fe8e6d838622139f821c30d97e6ba2d621e67e2f4a535c63850e5fc1fe43783: Status 404 returned error can't find the container with id 1fe8e6d838622139f821c30d97e6ba2d621e67e2f4a535c63850e5fc1fe43783 Apr 24 19:47:01.158703 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:01.158599 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerStarted","Data":"2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e"} Apr 24 19:47:01.158703 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:01.158641 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerStarted","Data":"1fe8e6d838622139f821c30d97e6ba2d621e67e2f4a535c63850e5fc1fe43783"} Apr 24 19:47:01.163260 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:01.163221 2564 generic.go:358] "Generic (PLEG): container finished" podID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerID="eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36" exitCode=2 Apr 24 19:47:01.163433 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:01.163266 2564 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerDied","Data":"eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36"} Apr 24 19:47:03.925462 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:03.925411 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused" Apr 24 19:47:05.175804 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.175771 2564 generic.go:358] "Generic (PLEG): container finished" podID="bb51790e-e145-4547-b4cf-603deb23e422" containerID="2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e" exitCode=0 Apr 24 19:47:05.176210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.175841 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerDied","Data":"2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e"} Apr 24 19:47:05.180565 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.180520 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 24 19:47:05.442925 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.442900 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:47:05.509543 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.509500 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kserve-provision-location\") pod \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " Apr 24 19:47:05.509736 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.509586 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls\") pod \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " Apr 24 19:47:05.509736 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.509606 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzvxx\" (UniqueName: \"kubernetes.io/projected/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kube-api-access-dzvxx\") pod \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " Apr 24 19:47:05.509736 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.509627 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\" (UID: \"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c\") " Apr 24 19:47:05.509915 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.509888 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" (UID: "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:47:05.510128 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.510097 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" (UID: "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:47:05.511742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.511717 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" (UID: "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:47:05.511837 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.511759 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kube-api-access-dzvxx" (OuterVolumeSpecName: "kube-api-access-dzvxx") pod "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" (UID: "1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c"). InnerVolumeSpecName "kube-api-access-dzvxx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:47:05.610167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.610118 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:47:05.610167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.610158 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzvxx\" (UniqueName: \"kubernetes.io/projected/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kube-api-access-dzvxx\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:47:05.610167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.610170 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:47:05.610167 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:05.610181 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:47:06.181493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.181395 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerStarted","Data":"794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc"} Apr 24 19:47:06.181493 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.181434 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" 
event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerStarted","Data":"f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0"} Apr 24 19:47:06.182037 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.181691 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:06.183056 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.183033 2564 generic.go:358] "Generic (PLEG): container finished" podID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerID="e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411" exitCode=0 Apr 24 19:47:06.183174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.183064 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerDied","Data":"e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411"} Apr 24 19:47:06.183174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.183087 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" event={"ID":"1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c","Type":"ContainerDied","Data":"4d88b60b29e146bcf2de36e5ddf1e361ba593e19e2106ea2686aac07ad31b921"} Apr 24 19:47:06.183174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.183101 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh" Apr 24 19:47:06.183325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.183105 2564 scope.go:117] "RemoveContainer" containerID="eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36" Apr 24 19:47:06.191107 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.191082 2564 scope.go:117] "RemoveContainer" containerID="e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411" Apr 24 19:47:06.198162 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.198143 2564 scope.go:117] "RemoveContainer" containerID="415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1" Apr 24 19:47:06.201322 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.201281 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podStartSLOduration=6.201266883 podStartE2EDuration="6.201266883s" podCreationTimestamp="2026-04-24 19:47:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:47:06.19932089 +0000 UTC m=+2419.523565064" watchObservedRunningTime="2026-04-24 19:47:06.201266883 +0000 UTC m=+2419.525511046" Apr 24 19:47:06.206015 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.205994 2564 scope.go:117] "RemoveContainer" containerID="eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36" Apr 24 19:47:06.206264 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:47:06.206246 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36\": container with ID starting with eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36 not found: ID does not exist" 
containerID="eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36" Apr 24 19:47:06.206335 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.206277 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36"} err="failed to get container status \"eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36\": rpc error: code = NotFound desc = could not find container \"eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36\": container with ID starting with eac67692c06bf6b4f74b0c80f70838a72af0ed0de123ac891a590b734b7b5b36 not found: ID does not exist" Apr 24 19:47:06.206335 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.206302 2564 scope.go:117] "RemoveContainer" containerID="e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411" Apr 24 19:47:06.206575 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:47:06.206538 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411\": container with ID starting with e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411 not found: ID does not exist" containerID="e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411" Apr 24 19:47:06.206672 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.206580 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411"} err="failed to get container status \"e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411\": rpc error: code = NotFound desc = could not find container \"e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411\": container with ID starting with e97fcc1a34dea090bf5c9a46d386f171f628986d9b8f3274c741d89945dd1411 not found: ID does not exist" Apr 24 
19:47:06.206672 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.206597 2564 scope.go:117] "RemoveContainer" containerID="415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1" Apr 24 19:47:06.206833 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:47:06.206814 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1\": container with ID starting with 415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1 not found: ID does not exist" containerID="415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1" Apr 24 19:47:06.206874 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.206838 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1"} err="failed to get container status \"415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1\": rpc error: code = NotFound desc = could not find container \"415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1\": container with ID starting with 415bc1fe1bea2a2fca7ceaa7ea3699c4c5dbb74e74b03630a95374a9cf9150d1 not found: ID does not exist" Apr 24 19:47:06.211534 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.211514 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh"] Apr 24 19:47:06.214378 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:06.214357 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-zzwsh"] Apr 24 19:47:07.185476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:07.185440 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" path="/var/lib/kubelet/pods/1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c/volumes" Apr 24 
19:47:07.186070 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:07.186055 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:13.195199 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:13.195168 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:47:43.196367 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:43.196322 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 19:47:53.196154 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:47:53.196109 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 19:48:03.195822 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:03.195777 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 19:48:13.196618 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:13.196571 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 19:48:23.199658 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:23.199628 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:48:30.416235 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.416196 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk"] Apr 24 19:48:30.416669 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.416576 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container" containerID="cri-o://f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0" gracePeriod=30 Apr 24 19:48:30.416669 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.416598 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kube-rbac-proxy" containerID="cri-o://794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc" gracePeriod=30 Apr 24 19:48:30.514949 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.514901 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"] Apr 24 19:48:30.515311 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515265 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kube-rbac-proxy" Apr 24 19:48:30.515419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515314 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kube-rbac-proxy" Apr 24 19:48:30.515419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515340 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" Apr 24 19:48:30.515419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515348 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" Apr 24 19:48:30.515419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515363 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="storage-initializer" Apr 24 19:48:30.515419 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515372 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="storage-initializer" Apr 24 19:48:30.515713 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515451 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kube-rbac-proxy" Apr 24 19:48:30.515713 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.515464 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="1fc2d6f9-66d7-4ea9-81ab-1fffbb55a09c" containerName="kserve-container" Apr 24 19:48:30.518672 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.518648 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.520985 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.520964 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 24 19:48:30.521099 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.520988 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 24 19:48:30.528368 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.528339 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"] Apr 24 19:48:30.652228 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.652194 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d3c16b3-0430-4053-abce-d9725a0a4dec-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.652400 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.652238 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d3c16b3-0430-4053-abce-d9725a0a4dec-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.652400 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.652310 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d3c16b3-0430-4053-abce-d9725a0a4dec-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.652400 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.652342 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8sbs\" (UniqueName: \"kubernetes.io/projected/0d3c16b3-0430-4053-abce-d9725a0a4dec-kube-api-access-z8sbs\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.753072 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.753027 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d3c16b3-0430-4053-abce-d9725a0a4dec-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.753072 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.753078 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d3c16b3-0430-4053-abce-d9725a0a4dec-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.753352 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.753111 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d3c16b3-0430-4053-abce-d9725a0a4dec-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.753352 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.753147 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8sbs\" (UniqueName: \"kubernetes.io/projected/0d3c16b3-0430-4053-abce-d9725a0a4dec-kube-api-access-z8sbs\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.753613 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.753592 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d3c16b3-0430-4053-abce-d9725a0a4dec-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.753941 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.753920 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d3c16b3-0430-4053-abce-d9725a0a4dec-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.755502 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.755482 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d3c16b3-0430-4053-abce-d9725a0a4dec-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.761268 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.761242 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8sbs\" (UniqueName: \"kubernetes.io/projected/0d3c16b3-0430-4053-abce-d9725a0a4dec-kube-api-access-z8sbs\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.831175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.831133 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" Apr 24 19:48:30.958562 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.958354 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"] Apr 24 19:48:30.961170 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:48:30.961143 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d3c16b3_0430_4053_abce_d9725a0a4dec.slice/crio-cc8c1a7039a83fc6a4c3564c4d3630c21457cb8f4a4f33d0591ebc17f80d10e6 WatchSource:0}: Error finding container cc8c1a7039a83fc6a4c3564c4d3630c21457cb8f4a4f33d0591ebc17f80d10e6: Status 404 returned error can't find the container with id cc8c1a7039a83fc6a4c3564c4d3630c21457cb8f4a4f33d0591ebc17f80d10e6 Apr 24 19:48:30.963057 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:30.963039 2564 provider.go:93] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Apr 24 19:48:31.423061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:31.422956 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerStarted","Data":"655f0c42813232efd714830e03028b3f0d67b6393217912590f3196357fe5ebb"} Apr 24 19:48:31.423061 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:31.423001 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerStarted","Data":"cc8c1a7039a83fc6a4c3564c4d3630c21457cb8f4a4f33d0591ebc17f80d10e6"} Apr 24 19:48:31.425435 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:31.425407 2564 generic.go:358] "Generic (PLEG): container finished" podID="bb51790e-e145-4547-b4cf-603deb23e422" containerID="794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc" exitCode=2 Apr 24 19:48:31.425581 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:31.425464 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerDied","Data":"794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc"} Apr 24 19:48:33.189741 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:33.189699 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.44:8643/healthz\": dial tcp 10.132.0.44:8643: connect: connection refused" Apr 24 19:48:33.196360 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:33.196325 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 24 19:48:35.366699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.366672 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" Apr 24 19:48:35.442994 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.442956 2564 generic.go:358] "Generic (PLEG): container finished" podID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerID="655f0c42813232efd714830e03028b3f0d67b6393217912590f3196357fe5ebb" exitCode=0 Apr 24 19:48:35.443156 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.443030 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerDied","Data":"655f0c42813232efd714830e03028b3f0d67b6393217912590f3196357fe5ebb"} Apr 24 19:48:35.444806 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.444757 2564 generic.go:358] "Generic (PLEG): container finished" podID="bb51790e-e145-4547-b4cf-603deb23e422" containerID="f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0" exitCode=0 Apr 24 19:48:35.444806 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.444794 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerDied","Data":"f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0"} Apr 24 19:48:35.444988 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.444828 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk" event={"ID":"bb51790e-e145-4547-b4cf-603deb23e422","Type":"ContainerDied","Data":"1fe8e6d838622139f821c30d97e6ba2d621e67e2f4a535c63850e5fc1fe43783"}
Apr 24 19:48:35.444988 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.444849 2564 scope.go:117] "RemoveContainer" containerID="794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc"
Apr 24 19:48:35.444988 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.444865 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk"
Apr 24 19:48:35.452746 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.452726 2564 scope.go:117] "RemoveContainer" containerID="f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0"
Apr 24 19:48:35.459841 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.459821 2564 scope.go:117] "RemoveContainer" containerID="2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e"
Apr 24 19:48:35.470838 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.470814 2564 scope.go:117] "RemoveContainer" containerID="794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc"
Apr 24 19:48:35.471517 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:48:35.471493 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc\": container with ID starting with 794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc not found: ID does not exist" containerID="794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc"
Apr 24 19:48:35.471650 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.471529 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc"} err="failed to get container status \"794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc\": rpc error: code = NotFound desc = could not find container \"794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc\": container with ID starting with 794c7c1f10e803b9a8d83673633d16073b4d4aefeb0faa4c019c8d1c7126e1dc not found: ID does not exist"
Apr 24 19:48:35.471650 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.471577 2564 scope.go:117] "RemoveContainer" containerID="f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0"
Apr 24 19:48:35.471934 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:48:35.471912 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0\": container with ID starting with f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0 not found: ID does not exist" containerID="f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0"
Apr 24 19:48:35.471994 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.471940 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0"} err="failed to get container status \"f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0\": rpc error: code = NotFound desc = could not find container \"f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0\": container with ID starting with f406390101f2722061e0fddcd1fd6a4f71dfd7709e6aabfc1f19a4f4170027d0 not found: ID does not exist"
Apr 24 19:48:35.471994 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.471956 2564 scope.go:117] "RemoveContainer" containerID="2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e"
Apr 24 19:48:35.472221 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:48:35.472201 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e\": container with ID starting with 2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e not found: ID does not exist" containerID="2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e"
Apr 24 19:48:35.472275 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.472226 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e"} err="failed to get container status \"2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e\": rpc error: code = NotFound desc = could not find container \"2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e\": container with ID starting with 2b314db4354503a771280516dd45ef6b26a27e24681340a8f046b9d2ba59da0e not found: ID does not exist"
Apr 24 19:48:35.489993 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.489965 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb51790e-e145-4547-b4cf-603deb23e422-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"bb51790e-e145-4547-b4cf-603deb23e422\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") "
Apr 24 19:48:35.490142 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.490016 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8pm8\" (UniqueName: \"kubernetes.io/projected/bb51790e-e145-4547-b4cf-603deb23e422-kube-api-access-b8pm8\") pod \"bb51790e-e145-4547-b4cf-603deb23e422\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") "
Apr 24 19:48:35.490142 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.490048 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb51790e-e145-4547-b4cf-603deb23e422-proxy-tls\") pod \"bb51790e-e145-4547-b4cf-603deb23e422\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") "
Apr 24 19:48:35.490142 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.490068 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb51790e-e145-4547-b4cf-603deb23e422-kserve-provision-location\") pod \"bb51790e-e145-4547-b4cf-603deb23e422\" (UID: \"bb51790e-e145-4547-b4cf-603deb23e422\") "
Apr 24 19:48:35.490386 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.490356 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bb51790e-e145-4547-b4cf-603deb23e422-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "bb51790e-e145-4547-b4cf-603deb23e422" (UID: "bb51790e-e145-4547-b4cf-603deb23e422"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:48:35.490441 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.490409 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb51790e-e145-4547-b4cf-603deb23e422-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bb51790e-e145-4547-b4cf-603deb23e422" (UID: "bb51790e-e145-4547-b4cf-603deb23e422"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:48:35.492156 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.492131 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb51790e-e145-4547-b4cf-603deb23e422-kube-api-access-b8pm8" (OuterVolumeSpecName: "kube-api-access-b8pm8") pod "bb51790e-e145-4547-b4cf-603deb23e422" (UID: "bb51790e-e145-4547-b4cf-603deb23e422"). InnerVolumeSpecName "kube-api-access-b8pm8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:48:35.492270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.492153 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bb51790e-e145-4547-b4cf-603deb23e422-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "bb51790e-e145-4547-b4cf-603deb23e422" (UID: "bb51790e-e145-4547-b4cf-603deb23e422"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:48:35.591026 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.590984 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b8pm8\" (UniqueName: \"kubernetes.io/projected/bb51790e-e145-4547-b4cf-603deb23e422-kube-api-access-b8pm8\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:48:35.591221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.591096 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bb51790e-e145-4547-b4cf-603deb23e422-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:48:35.591221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.591114 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bb51790e-e145-4547-b4cf-603deb23e422-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:48:35.591221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.591130 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/bb51790e-e145-4547-b4cf-603deb23e422-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:48:35.766384 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.766347 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk"]
Apr 24 19:48:35.772859 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:35.772830 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-mwljk"]
Apr 24 19:48:36.449647 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:36.449608 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerStarted","Data":"4ca0c4341c0565cd5979fc3a65ef1a27b705e4c2144514af8220fa7a479c471f"}
Apr 24 19:48:36.450128 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:36.449651 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerStarted","Data":"c8008ad254dacc452a316afe20976544b1c9b02829c61079f845a1665430344a"}
Apr 24 19:48:36.450128 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:36.449895 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"
Apr 24 19:48:36.450128 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:36.449924 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"
Apr 24 19:48:36.470113 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:36.470055 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podStartSLOduration=6.470040282 podStartE2EDuration="6.470040282s" podCreationTimestamp="2026-04-24 19:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:48:36.469797864 +0000 UTC m=+2509.794042027" watchObservedRunningTime="2026-04-24 19:48:36.470040282 +0000 UTC m=+2509.794284444"
Apr 24 19:48:37.185461 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:37.185420 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb51790e-e145-4547-b4cf-603deb23e422" path="/var/lib/kubelet/pods/bb51790e-e145-4547-b4cf-603deb23e422/volumes"
Apr 24 19:48:42.459965 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:48:42.459935 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"
Apr 24 19:49:12.460824 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:49:12.460774 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 24 19:49:22.460924 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:49:22.460885 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 24 19:49:32.460963 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:49:32.460922 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 24 19:49:42.461305 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:49:42.461254 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 24 19:49:52.463994 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:49:52.463958 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"
Apr 24 19:50:00.598603 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:00.598564 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"]
Apr 24 19:50:00.599174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:00.598995 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" containerID="cri-o://c8008ad254dacc452a316afe20976544b1c9b02829c61079f845a1665430344a" gracePeriod=30
Apr 24 19:50:00.599174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:00.599022 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kube-rbac-proxy" containerID="cri-o://4ca0c4341c0565cd5979fc3a65ef1a27b705e4c2144514af8220fa7a479c471f" gracePeriod=30
Apr 24 19:50:01.686075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:01.686040 2564 generic.go:358] "Generic (PLEG): container finished" podID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerID="4ca0c4341c0565cd5979fc3a65ef1a27b705e4c2144514af8220fa7a479c471f" exitCode=2
Apr 24 19:50:01.686469 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:01.686116 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerDied","Data":"4ca0c4341c0565cd5979fc3a65ef1a27b705e4c2144514af8220fa7a479c471f"}
Apr 24 19:50:02.455680 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.455634 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused"
Apr 24 19:50:02.461087 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.461059 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 24 19:50:02.800569 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800475 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"]
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800785 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kube-rbac-proxy"
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800799 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kube-rbac-proxy"
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800830 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="storage-initializer"
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800836 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="storage-initializer"
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800870 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container"
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800879 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container"
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800924 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kube-rbac-proxy"
Apr 24 19:50:02.800969 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.800934 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="bb51790e-e145-4547-b4cf-603deb23e422" containerName="kserve-container"
Apr 24 19:50:02.803946 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.803928 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:02.806654 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.806632 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\""
Apr 24 19:50:02.806763 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.806695 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\""
Apr 24 19:50:02.820460 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.820431 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"]
Apr 24 19:50:02.948455 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.948412 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:02.948455 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.948449 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kserve-provision-location\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:02.948678 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.948488 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpnb\" (UniqueName: \"kubernetes.io/projected/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kube-api-access-6tpnb\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:02.948678 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:02.948570 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-proxy-tls\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.049174 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.049129 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-proxy-tls\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.049350 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.049211 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.049350 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.049229 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kserve-provision-location\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.049350 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.049260 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tpnb\" (UniqueName: \"kubernetes.io/projected/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kube-api-access-6tpnb\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.049745 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.049723 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kserve-provision-location\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.050033 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.050016 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.051679 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.051638 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-proxy-tls\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.058291 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.058263 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tpnb\" (UniqueName: \"kubernetes.io/projected/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kube-api-access-6tpnb\") pod \"isvc-sklearn-predictor-76f6b66fdb-pnjql\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.114209 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.114172 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:03.239546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.239434 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"]
Apr 24 19:50:03.242354 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:50:03.242327 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7dde9a1_4bdf_4775_9b07_6048c4a95f4b.slice/crio-9dc2483681cb870da900df3179c9cdf397e873688be932e248de557002dc1664 WatchSource:0}: Error finding container 9dc2483681cb870da900df3179c9cdf397e873688be932e248de557002dc1664: Status 404 returned error can't find the container with id 9dc2483681cb870da900df3179c9cdf397e873688be932e248de557002dc1664
Apr 24 19:50:03.693764 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.693725 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerStarted","Data":"69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97"}
Apr 24 19:50:03.693764 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:03.693766 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerStarted","Data":"9dc2483681cb870da900df3179c9cdf397e873688be932e248de557002dc1664"}
Apr 24 19:50:05.701619 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:05.701583 2564 generic.go:358] "Generic (PLEG): container finished" podID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerID="c8008ad254dacc452a316afe20976544b1c9b02829c61079f845a1665430344a" exitCode=0
Apr 24 19:50:05.701976 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:05.701656 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerDied","Data":"c8008ad254dacc452a316afe20976544b1c9b02829c61079f845a1665430344a"}
Apr 24 19:50:05.939315 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:05.939292 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"
Apr 24 19:50:06.073893 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.073778 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d3c16b3-0430-4053-abce-d9725a0a4dec-proxy-tls\") pod \"0d3c16b3-0430-4053-abce-d9725a0a4dec\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") "
Apr 24 19:50:06.073893 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.073853 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d3c16b3-0430-4053-abce-d9725a0a4dec-kserve-provision-location\") pod \"0d3c16b3-0430-4053-abce-d9725a0a4dec\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") "
Apr 24 19:50:06.073893 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.073891 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8sbs\" (UniqueName: \"kubernetes.io/projected/0d3c16b3-0430-4053-abce-d9725a0a4dec-kube-api-access-z8sbs\") pod \"0d3c16b3-0430-4053-abce-d9725a0a4dec\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") "
Apr 24 19:50:06.074139 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.073925 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d3c16b3-0430-4053-abce-d9725a0a4dec-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"0d3c16b3-0430-4053-abce-d9725a0a4dec\" (UID: \"0d3c16b3-0430-4053-abce-d9725a0a4dec\") "
Apr 24 19:50:06.074256 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.074236 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d3c16b3-0430-4053-abce-d9725a0a4dec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0d3c16b3-0430-4053-abce-d9725a0a4dec" (UID: "0d3c16b3-0430-4053-abce-d9725a0a4dec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:50:06.074300 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.074274 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3c16b3-0430-4053-abce-d9725a0a4dec-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "0d3c16b3-0430-4053-abce-d9725a0a4dec" (UID: "0d3c16b3-0430-4053-abce-d9725a0a4dec"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:50:06.076069 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.076043 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3c16b3-0430-4053-abce-d9725a0a4dec-kube-api-access-z8sbs" (OuterVolumeSpecName: "kube-api-access-z8sbs") pod "0d3c16b3-0430-4053-abce-d9725a0a4dec" (UID: "0d3c16b3-0430-4053-abce-d9725a0a4dec"). InnerVolumeSpecName "kube-api-access-z8sbs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:50:06.076069 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.076054 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3c16b3-0430-4053-abce-d9725a0a4dec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0d3c16b3-0430-4053-abce-d9725a0a4dec" (UID: "0d3c16b3-0430-4053-abce-d9725a0a4dec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:50:06.175275 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.175225 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8sbs\" (UniqueName: \"kubernetes.io/projected/0d3c16b3-0430-4053-abce-d9725a0a4dec-kube-api-access-z8sbs\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:50:06.175275 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.175270 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0d3c16b3-0430-4053-abce-d9725a0a4dec-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:50:06.175275 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.175284 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0d3c16b3-0430-4053-abce-d9725a0a4dec-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:50:06.175275 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.175295 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d3c16b3-0430-4053-abce-d9725a0a4dec-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:50:06.706761 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.706728 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk" event={"ID":"0d3c16b3-0430-4053-abce-d9725a0a4dec","Type":"ContainerDied","Data":"cc8c1a7039a83fc6a4c3564c4d3630c21457cb8f4a4f33d0591ebc17f80d10e6"}
Apr 24 19:50:06.706761 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.706759 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"
Apr 24 19:50:06.707240 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.706773 2564 scope.go:117] "RemoveContainer" containerID="4ca0c4341c0565cd5979fc3a65ef1a27b705e4c2144514af8220fa7a479c471f"
Apr 24 19:50:06.714914 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.714790 2564 scope.go:117] "RemoveContainer" containerID="c8008ad254dacc452a316afe20976544b1c9b02829c61079f845a1665430344a"
Apr 24 19:50:06.721626 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.721603 2564 scope.go:117] "RemoveContainer" containerID="655f0c42813232efd714830e03028b3f0d67b6393217912590f3196357fe5ebb"
Apr 24 19:50:06.727949 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.727925 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"]
Apr 24 19:50:06.731577 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:06.731540 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-7xjvk"]
Apr 24 19:50:07.183938 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:07.183905 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" path="/var/lib/kubelet/pods/0d3c16b3-0430-4053-abce-d9725a0a4dec/volumes"
Apr 24 19:50:07.715673 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:07.715639 2564 generic.go:358] "Generic (PLEG): container finished" podID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerID="69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97" exitCode=0
Apr 24 19:50:07.716106 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:07.715712 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerDied","Data":"69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97"}
Apr 24 19:50:08.721172 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:08.721136 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerStarted","Data":"3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753"}
Apr 24 19:50:08.721172 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:08.721178 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerStarted","Data":"1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d"}
Apr 24 19:50:08.721645 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:08.721469 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:08.721645 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:08.721583 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:08.722825 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:08.722794 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:50:08.741565 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:08.741508 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podStartSLOduration=6.741496915 podStartE2EDuration="6.741496915s" podCreationTimestamp="2026-04-24 19:50:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:50:08.740206242 +0000 UTC m=+2602.064450402" watchObservedRunningTime="2026-04-24 19:50:08.741496915 +0000 UTC m=+2602.065741078"
Apr 24 19:50:09.724414 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:09.724365 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:50:14.729132 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:14.729100 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"
Apr 24 19:50:14.729763 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:14.729733 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:50:24.730421 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:24.730373 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:50:34.729698 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:34.729655 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:50:44.730190 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:44.730148 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:50:54.730375 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:50:54.730337 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:51:04.729707 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:04.729668 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 24 19:51:14.730702 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:14.730673 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness"
status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" Apr 24 19:51:22.898127 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:22.898093 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"] Apr 24 19:51:22.898503 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:22.898412 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" containerID="cri-o://1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d" gracePeriod=30 Apr 24 19:51:22.898598 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:22.898473 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kube-rbac-proxy" containerID="cri-o://3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753" gracePeriod=30 Apr 24 19:51:23.019952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.019923 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj"] Apr 24 19:51:23.020210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020199 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="storage-initializer" Apr 24 19:51:23.020263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020211 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="storage-initializer" Apr 24 19:51:23.020263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020220 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kube-rbac-proxy" Apr 24 
19:51:23.020263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020226 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kube-rbac-proxy" Apr 24 19:51:23.020263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020243 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" Apr 24 19:51:23.020263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020248 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" Apr 24 19:51:23.020431 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020298 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kube-rbac-proxy" Apr 24 19:51:23.020431 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.020309 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d3c16b3-0430-4053-abce-d9725a0a4dec" containerName="kserve-container" Apr 24 19:51:23.023232 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.023215 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.025598 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.025572 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 24 19:51:23.025729 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.025694 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 19:51:23.040467 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.040438 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj"] Apr 24 19:51:23.159030 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.158941 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c695fc-9520-4414-a48d-658077fb94c0-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.159030 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.158995 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd2tb\" (UniqueName: \"kubernetes.io/projected/f2c695fc-9520-4414-a48d-658077fb94c0-kube-api-access-jd2tb\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.159221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.159071 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/f2c695fc-9520-4414-a48d-658077fb94c0-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.159221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.159117 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f2c695fc-9520-4414-a48d-658077fb94c0-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.260346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.260291 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c695fc-9520-4414-a48d-658077fb94c0-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.260346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.260359 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jd2tb\" (UniqueName: \"kubernetes.io/projected/f2c695fc-9520-4414-a48d-658077fb94c0-kube-api-access-jd2tb\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.260661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.260393 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2c695fc-9520-4414-a48d-658077fb94c0-proxy-tls\") pod 
\"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.260661 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.260422 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f2c695fc-9520-4414-a48d-658077fb94c0-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.260829 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.260801 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c695fc-9520-4414-a48d-658077fb94c0-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.261140 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.261112 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f2c695fc-9520-4414-a48d-658077fb94c0-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.263021 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.262997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2c695fc-9520-4414-a48d-658077fb94c0-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: 
\"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.268648 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.268615 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd2tb\" (UniqueName: \"kubernetes.io/projected/f2c695fc-9520-4414-a48d-658077fb94c0-kube-api-access-jd2tb\") pod \"sklearn-v2-mlserver-predictor-65d8664766-gmnpj\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.333143 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.333086 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:23.460692 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.460660 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj"] Apr 24 19:51:23.463998 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:51:23.463973 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c695fc_9520_4414_a48d_658077fb94c0.slice/crio-a103214385bc9da27869306a1b5f8385ff8a0430874525b879a9fb277120d0ae WatchSource:0}: Error finding container a103214385bc9da27869306a1b5f8385ff8a0430874525b879a9fb277120d0ae: Status 404 returned error can't find the container with id a103214385bc9da27869306a1b5f8385ff8a0430874525b879a9fb277120d0ae Apr 24 19:51:23.923287 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.923254 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerStarted","Data":"a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1"} Apr 24 19:51:23.923796 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:51:23.923294 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerStarted","Data":"a103214385bc9da27869306a1b5f8385ff8a0430874525b879a9fb277120d0ae"} Apr 24 19:51:23.925205 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.925169 2564 generic.go:358] "Generic (PLEG): container finished" podID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerID="3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753" exitCode=2 Apr 24 19:51:23.925314 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:23.925206 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerDied","Data":"3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753"} Apr 24 19:51:24.725277 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:24.725224 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused" Apr 24 19:51:24.729621 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:24.729588 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 24 19:51:27.355877 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.355850 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" Apr 24 19:51:27.497488 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.497450 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-proxy-tls\") pod \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " Apr 24 19:51:27.497705 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.497516 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kserve-provision-location\") pod \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " Apr 24 19:51:27.497705 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.497574 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " Apr 24 19:51:27.497705 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.497635 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tpnb\" (UniqueName: \"kubernetes.io/projected/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kube-api-access-6tpnb\") pod \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\" (UID: \"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b\") " Apr 24 19:51:27.497910 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.497885 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" 
(UID: "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:51:27.497983 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.497956 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" (UID: "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:51:27.499591 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.499565 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" (UID: "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:51:27.499706 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.499681 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kube-api-access-6tpnb" (OuterVolumeSpecName: "kube-api-access-6tpnb") pod "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" (UID: "e7dde9a1-4bdf-4775-9b07-6048c4a95f4b"). InnerVolumeSpecName "kube-api-access-6tpnb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:51:27.598751 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.598698 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6tpnb\" (UniqueName: \"kubernetes.io/projected/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kube-api-access-6tpnb\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:51:27.598751 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.598745 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:51:27.598751 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.598756 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:51:27.598751 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.598767 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:51:27.943681 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.943641 2564 generic.go:358] "Generic (PLEG): container finished" podID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerID="1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d" exitCode=0 Apr 24 19:51:27.943881 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.943729 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" Apr 24 19:51:27.943881 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.943755 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerDied","Data":"1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d"} Apr 24 19:51:27.943881 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.943782 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql" event={"ID":"e7dde9a1-4bdf-4775-9b07-6048c4a95f4b","Type":"ContainerDied","Data":"9dc2483681cb870da900df3179c9cdf397e873688be932e248de557002dc1664"} Apr 24 19:51:27.943881 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.943798 2564 scope.go:117] "RemoveContainer" containerID="3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753" Apr 24 19:51:27.945161 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.945137 2564 generic.go:358] "Generic (PLEG): container finished" podID="f2c695fc-9520-4414-a48d-658077fb94c0" containerID="a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1" exitCode=0 Apr 24 19:51:27.945274 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.945186 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerDied","Data":"a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1"} Apr 24 19:51:27.952267 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.952242 2564 scope.go:117] "RemoveContainer" containerID="1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d" Apr 24 19:51:27.959434 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.959406 2564 scope.go:117] "RemoveContainer" 
containerID="69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97" Apr 24 19:51:27.967232 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.967208 2564 scope.go:117] "RemoveContainer" containerID="3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753" Apr 24 19:51:27.967512 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:51:27.967493 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753\": container with ID starting with 3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753 not found: ID does not exist" containerID="3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753" Apr 24 19:51:27.967612 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.967521 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753"} err="failed to get container status \"3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753\": rpc error: code = NotFound desc = could not find container \"3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753\": container with ID starting with 3c35bf49d4fd98470b0d6a6d30218a39d8e521b2d8fcc21c6757d47a6b118753 not found: ID does not exist" Apr 24 19:51:27.967612 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.967566 2564 scope.go:117] "RemoveContainer" containerID="1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d" Apr 24 19:51:27.967803 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:51:27.967786 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d\": container with ID starting with 1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d not found: ID does not exist" 
containerID="1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d" Apr 24 19:51:27.967861 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.967806 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d"} err="failed to get container status \"1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d\": rpc error: code = NotFound desc = could not find container \"1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d\": container with ID starting with 1bdb115cb45bcb26d0cbc571cfc5c433b82f843a8a96db90892c0ff0327f596d not found: ID does not exist" Apr 24 19:51:27.967861 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.967820 2564 scope.go:117] "RemoveContainer" containerID="69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97" Apr 24 19:51:27.968038 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:51:27.968017 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97\": container with ID starting with 69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97 not found: ID does not exist" containerID="69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97" Apr 24 19:51:27.968093 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.968048 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97"} err="failed to get container status \"69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97\": rpc error: code = NotFound desc = could not find container \"69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97\": container with ID starting with 69e0b5aee5a6286a6eba0942f7302c8564ec15440bf9fd9fa3aafab4396ffa97 not found: ID does not exist" Apr 24 
19:51:27.978094 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.978066 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"] Apr 24 19:51:27.983475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:27.983441 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-76f6b66fdb-pnjql"] Apr 24 19:51:28.950315 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:28.950280 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerStarted","Data":"707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842"} Apr 24 19:51:28.950315 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:28.950318 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerStarted","Data":"57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca"} Apr 24 19:51:28.950739 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:28.950508 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:28.950739 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:28.950535 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:51:28.971768 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:28.971706 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" podStartSLOduration=6.971691633 podStartE2EDuration="6.971691633s" podCreationTimestamp="2026-04-24 19:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:51:28.970600269 +0000 UTC m=+2682.294844431" watchObservedRunningTime="2026-04-24 19:51:28.971691633 +0000 UTC m=+2682.295935796" Apr 24 19:51:29.186360 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:29.186322 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" path="/var/lib/kubelet/pods/e7dde9a1-4bdf-4775-9b07-6048c4a95f4b/volumes" Apr 24 19:51:34.958752 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:51:34.958720 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:52:05.018072 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:05.018020 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 24 19:52:14.961344 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:14.961314 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:52:23.042314 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.042275 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj"] Apr 24 19:52:23.042775 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.042722 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kserve-container" containerID="cri-o://57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca" gracePeriod=30 Apr 24 19:52:23.042858 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:52:23.042793 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kube-rbac-proxy" containerID="cri-o://707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842" gracePeriod=30 Apr 24 19:52:23.137158 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137121 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"] Apr 24 19:52:23.137410 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137398 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="storage-initializer" Apr 24 19:52:23.137454 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137411 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="storage-initializer" Apr 24 19:52:23.137454 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137434 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" Apr 24 19:52:23.137454 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137440 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" Apr 24 19:52:23.137454 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137448 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kube-rbac-proxy" Apr 24 19:52:23.137601 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137455 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kube-rbac-proxy" Apr 24 19:52:23.137601 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137500 2564 
memory_manager.go:356] "RemoveStaleState removing state" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kube-rbac-proxy" Apr 24 19:52:23.137601 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.137510 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7dde9a1-4bdf-4775-9b07-6048c4a95f4b" containerName="kserve-container" Apr 24 19:52:23.140541 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.140520 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.142816 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.142795 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 24 19:52:23.143239 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.143217 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 24 19:52:23.150406 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.150378 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"] Apr 24 19:52:23.235130 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.235083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.235325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.235140 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.235325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.235166 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvzjq\" (UniqueName: \"kubernetes.io/projected/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kube-api-access-bvzjq\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.235325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.235237 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.336346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.336254 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.336346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.336326 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.336644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.336368 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.336644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.336394 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvzjq\" (UniqueName: \"kubernetes.io/projected/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kube-api-access-bvzjq\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.336767 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.336745 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.337075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.337055 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.338933 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.338907 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.344695 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.344666 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvzjq\" (UniqueName: \"kubernetes.io/projected/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kube-api-access-bvzjq\") pod \"isvc-sklearn-runtime-predictor-777bccbf48-rzdnn\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.451332 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.451292 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:23.572542 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:23.572508 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"] Apr 24 19:52:23.576918 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:52:23.576888 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89be8d83_3e69_4aff_beee_ea2f04dbf7b5.slice/crio-c0db65275b48785f3ae25c88f2916446b33f4c1bf545c7a68dad0af0a9c0888f WatchSource:0}: Error finding container c0db65275b48785f3ae25c88f2916446b33f4c1bf545c7a68dad0af0a9c0888f: Status 404 returned error can't find the container with id c0db65275b48785f3ae25c88f2916446b33f4c1bf545c7a68dad0af0a9c0888f Apr 24 19:52:24.117200 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:24.117165 2564 generic.go:358] "Generic (PLEG): container finished" podID="f2c695fc-9520-4414-a48d-658077fb94c0" containerID="707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842" exitCode=2 Apr 24 19:52:24.117698 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:24.117228 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerDied","Data":"707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842"} Apr 24 19:52:24.118670 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:24.118647 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerStarted","Data":"a8dea13542f7a2e88203c485126c6b7b676441cb291ef1437b220e838f2a86bf"} Apr 24 19:52:24.118816 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:24.118674 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerStarted","Data":"c0db65275b48785f3ae25c88f2916446b33f4c1bf545c7a68dad0af0a9c0888f"} Apr 24 19:52:24.954385 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:24.954344 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.47:8643/healthz\": dial tcp 10.132.0.47:8643: connect: connection refused" Apr 24 19:52:26.000774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:26.000729 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.47:8080/v2/models/sklearn-v2-mlserver/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 24 19:52:29.953925 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:29.953874 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.47:8643/healthz\": dial tcp 10.132.0.47:8643: connect: connection refused" Apr 24 19:52:30.137088 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:30.137052 2564 generic.go:358] "Generic (PLEG): container finished" podID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerID="a8dea13542f7a2e88203c485126c6b7b676441cb291ef1437b220e838f2a86bf" exitCode=0 Apr 24 19:52:30.137261 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:30.137098 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerDied","Data":"a8dea13542f7a2e88203c485126c6b7b676441cb291ef1437b220e838f2a86bf"} Apr 24 19:52:30.885651 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:30.885620 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:52:31.002333 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.002246 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jd2tb\" (UniqueName: \"kubernetes.io/projected/f2c695fc-9520-4414-a48d-658077fb94c0-kube-api-access-jd2tb\") pod \"f2c695fc-9520-4414-a48d-658077fb94c0\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " Apr 24 19:52:31.002333 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.002299 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2c695fc-9520-4414-a48d-658077fb94c0-proxy-tls\") pod \"f2c695fc-9520-4414-a48d-658077fb94c0\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " Apr 24 19:52:31.002333 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.002325 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c695fc-9520-4414-a48d-658077fb94c0-kserve-provision-location\") pod \"f2c695fc-9520-4414-a48d-658077fb94c0\" (UID: \"f2c695fc-9520-4414-a48d-658077fb94c0\") " Apr 24 19:52:31.002902 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.002348 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f2c695fc-9520-4414-a48d-658077fb94c0-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"f2c695fc-9520-4414-a48d-658077fb94c0\" (UID: 
\"f2c695fc-9520-4414-a48d-658077fb94c0\") " Apr 24 19:52:31.002902 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.002734 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c695fc-9520-4414-a48d-658077fb94c0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f2c695fc-9520-4414-a48d-658077fb94c0" (UID: "f2c695fc-9520-4414-a48d-658077fb94c0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:52:31.002902 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.002741 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2c695fc-9520-4414-a48d-658077fb94c0-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "f2c695fc-9520-4414-a48d-658077fb94c0" (UID: "f2c695fc-9520-4414-a48d-658077fb94c0"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:52:31.004362 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.004339 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c695fc-9520-4414-a48d-658077fb94c0-kube-api-access-jd2tb" (OuterVolumeSpecName: "kube-api-access-jd2tb") pod "f2c695fc-9520-4414-a48d-658077fb94c0" (UID: "f2c695fc-9520-4414-a48d-658077fb94c0"). InnerVolumeSpecName "kube-api-access-jd2tb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:52:31.004471 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.004349 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c695fc-9520-4414-a48d-658077fb94c0-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f2c695fc-9520-4414-a48d-658077fb94c0" (UID: "f2c695fc-9520-4414-a48d-658077fb94c0"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:52:31.103220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.103179 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f2c695fc-9520-4414-a48d-658077fb94c0-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:52:31.103220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.103212 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c695fc-9520-4414-a48d-658077fb94c0-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:52:31.103220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.103223 2564 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f2c695fc-9520-4414-a48d-658077fb94c0-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:52:31.103220 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.103233 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jd2tb\" (UniqueName: \"kubernetes.io/projected/f2c695fc-9520-4414-a48d-658077fb94c0-kube-api-access-jd2tb\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:52:31.141986 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.141954 2564 generic.go:358] "Generic (PLEG): container finished" podID="f2c695fc-9520-4414-a48d-658077fb94c0" containerID="57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca" exitCode=0 Apr 24 19:52:31.142169 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.142042 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" 
event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerDied","Data":"57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca"} Apr 24 19:52:31.142169 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.142051 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" Apr 24 19:52:31.142169 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.142071 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj" event={"ID":"f2c695fc-9520-4414-a48d-658077fb94c0","Type":"ContainerDied","Data":"a103214385bc9da27869306a1b5f8385ff8a0430874525b879a9fb277120d0ae"} Apr 24 19:52:31.142169 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.142091 2564 scope.go:117] "RemoveContainer" containerID="707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842" Apr 24 19:52:31.144360 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.144332 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerStarted","Data":"bcd78891c603e52b893f50310554b73ab1d921c65b0eb665448eaa7eadfc4407"} Apr 24 19:52:31.144480 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.144375 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerStarted","Data":"a6eb6912247ec7dc28fd24b7ed946e07963d8cbdeb36b13aa7f94390c8adf901"} Apr 24 19:52:31.144695 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.144680 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:31.151095 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.151070 2564 scope.go:117] 
"RemoveContainer" containerID="57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca" Apr 24 19:52:31.158636 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.158614 2564 scope.go:117] "RemoveContainer" containerID="a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1" Apr 24 19:52:31.164639 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.164594 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" podStartSLOduration=8.164577758 podStartE2EDuration="8.164577758s" podCreationTimestamp="2026-04-24 19:52:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:52:31.163185979 +0000 UTC m=+2744.487430153" watchObservedRunningTime="2026-04-24 19:52:31.164577758 +0000 UTC m=+2744.488821937" Apr 24 19:52:31.166406 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.166381 2564 scope.go:117] "RemoveContainer" containerID="707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842" Apr 24 19:52:31.166691 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:52:31.166671 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842\": container with ID starting with 707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842 not found: ID does not exist" containerID="707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842" Apr 24 19:52:31.166771 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.166701 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842"} err="failed to get container status \"707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842\": rpc error: code = NotFound desc = could not find 
container \"707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842\": container with ID starting with 707d05022da140f23eca3981034d35828a97d47338cb8733d2f5055497f5e842 not found: ID does not exist" Apr 24 19:52:31.166771 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.166721 2564 scope.go:117] "RemoveContainer" containerID="57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca" Apr 24 19:52:31.166970 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:52:31.166952 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca\": container with ID starting with 57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca not found: ID does not exist" containerID="57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca" Apr 24 19:52:31.167042 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.166981 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca"} err="failed to get container status \"57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca\": rpc error: code = NotFound desc = could not find container \"57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca\": container with ID starting with 57ae95adccd9e30f90b69ea75d0c9da94d0321e27f2cc9ab75ada509570c3aca not found: ID does not exist" Apr 24 19:52:31.167042 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.167005 2564 scope.go:117] "RemoveContainer" containerID="a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1" Apr 24 19:52:31.167228 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:52:31.167212 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1\": container with ID 
starting with a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1 not found: ID does not exist" containerID="a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1" Apr 24 19:52:31.167281 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.167234 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1"} err="failed to get container status \"a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1\": rpc error: code = NotFound desc = could not find container \"a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1\": container with ID starting with a328d4c6fb408e02a48528c9f3505be942e8259c41bfab978580a7e968add7c1 not found: ID does not exist" Apr 24 19:52:31.176325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.176296 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj"] Apr 24 19:52:31.179698 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.179674 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-gmnpj"] Apr 24 19:52:31.183841 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:31.183819 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" path="/var/lib/kubelet/pods/f2c695fc-9520-4414-a48d-658077fb94c0/volumes" Apr 24 19:52:32.148520 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:32.148486 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:32.150060 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:32.150035 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 24 19:52:33.158105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:33.158063 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 24 19:52:38.163836 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:38.163809 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:52:38.164313 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:38.164284 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 24 19:52:48.164839 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:52:48.164809 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" Apr 24 19:53:00.033430 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.033403 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-777bccbf48-rzdnn_89be8d83-3e69-4aff-beee-ea2f04dbf7b5/kserve-container/0.log" Apr 24 19:53:00.213862 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.213829 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"] Apr 24 19:53:00.214185 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.214137 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kserve-container" containerID="cri-o://a6eb6912247ec7dc28fd24b7ed946e07963d8cbdeb36b13aa7f94390c8adf901" gracePeriod=30
Apr 24 19:53:00.214321 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.214183 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kube-rbac-proxy" containerID="cri-o://bcd78891c603e52b893f50310554b73ab1d921c65b0eb665448eaa7eadfc4407" gracePeriod=30
Apr 24 19:53:00.297546 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297458 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"]
Apr 24 19:53:00.297814 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297800 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="storage-initializer"
Apr 24 19:53:00.297858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297815 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="storage-initializer"
Apr 24 19:53:00.297858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297825 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kube-rbac-proxy"
Apr 24 19:53:00.297858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297831 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kube-rbac-proxy"
Apr 24 19:53:00.297858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297847 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kserve-container"
Apr 24 19:53:00.297858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297853 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kserve-container"
Apr 24 19:53:00.298010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297897 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kube-rbac-proxy"
Apr 24 19:53:00.298010 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.297907 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2c695fc-9520-4414-a48d-658077fb94c0" containerName="kserve-container"
Apr 24 19:53:00.300914 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.300897 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.303059 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.303033 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\""
Apr 24 19:53:00.303197 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.303174 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 24 19:53:00.309854 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.309819 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"]
Apr 24 19:53:00.445424 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.445386 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f55a748d-b218-4285-95a1-654a4a780e95-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.445646 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.445444 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f55a748d-b218-4285-95a1-654a4a780e95-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.445646 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.445470 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgc78\" (UniqueName: \"kubernetes.io/projected/f55a748d-b218-4285-95a1-654a4a780e95-kube-api-access-qgc78\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.445646 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.445485 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f55a748d-b218-4285-95a1-654a4a780e95-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.546538 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.546502 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f55a748d-b218-4285-95a1-654a4a780e95-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.546724 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.546563 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f55a748d-b218-4285-95a1-654a4a780e95-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.546724 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.546583 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qgc78\" (UniqueName: \"kubernetes.io/projected/f55a748d-b218-4285-95a1-654a4a780e95-kube-api-access-qgc78\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.546724 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.546603 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f55a748d-b218-4285-95a1-654a4a780e95-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.546967 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.546949 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f55a748d-b218-4285-95a1-654a4a780e95-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.547269 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.547249 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f55a748d-b218-4285-95a1-654a4a780e95-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.548908 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.548861 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f55a748d-b218-4285-95a1-654a4a780e95-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.555054 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.555031 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgc78\" (UniqueName: \"kubernetes.io/projected/f55a748d-b218-4285-95a1-654a4a780e95-kube-api-access-qgc78\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.611123 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.611086 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:00.732899 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:00.732871 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"]
Apr 24 19:53:00.735171 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:53:00.735144 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf55a748d_b218_4285_95a1_654a4a780e95.slice/crio-a5caf7a82074efe50e103d5f81c10b53a868b2b728f186cfa036aca7c6ae84b7 WatchSource:0}: Error finding container a5caf7a82074efe50e103d5f81c10b53a868b2b728f186cfa036aca7c6ae84b7: Status 404 returned error can't find the container with id a5caf7a82074efe50e103d5f81c10b53a868b2b728f186cfa036aca7c6ae84b7
Apr 24 19:53:01.244378 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.244345 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerStarted","Data":"cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a"}
Apr 24 19:53:01.244859 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.244386 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerStarted","Data":"a5caf7a82074efe50e103d5f81c10b53a868b2b728f186cfa036aca7c6ae84b7"}
Apr 24 19:53:01.246480 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.246452 2564 generic.go:358] "Generic (PLEG): container finished" podID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerID="bcd78891c603e52b893f50310554b73ab1d921c65b0eb665448eaa7eadfc4407" exitCode=2
Apr 24 19:53:01.246480 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.246477 2564 generic.go:358] "Generic (PLEG): container finished" podID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerID="a6eb6912247ec7dc28fd24b7ed946e07963d8cbdeb36b13aa7f94390c8adf901" exitCode=0
Apr 24 19:53:01.246652 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.246522 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerDied","Data":"bcd78891c603e52b893f50310554b73ab1d921c65b0eb665448eaa7eadfc4407"}
Apr 24 19:53:01.246652 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.246569 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerDied","Data":"a6eb6912247ec7dc28fd24b7ed946e07963d8cbdeb36b13aa7f94390c8adf901"}
Apr 24 19:53:01.269192 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.269165 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"
Apr 24 19:53:01.454850 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.454818 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kserve-provision-location\") pod \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") "
Apr 24 19:53:01.455044 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.454859 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvzjq\" (UniqueName: \"kubernetes.io/projected/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kube-api-access-bvzjq\") pod \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") "
Apr 24 19:53:01.455044 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.454893 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") "
Apr 24 19:53:01.455044 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.454921 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-proxy-tls\") pod \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\" (UID: \"89be8d83-3e69-4aff-beee-ea2f04dbf7b5\") "
Apr 24 19:53:01.455283 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.455255 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "89be8d83-3e69-4aff-beee-ea2f04dbf7b5" (UID: "89be8d83-3e69-4aff-beee-ea2f04dbf7b5"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:53:01.457036 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.457013 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kube-api-access-bvzjq" (OuterVolumeSpecName: "kube-api-access-bvzjq") pod "89be8d83-3e69-4aff-beee-ea2f04dbf7b5" (UID: "89be8d83-3e69-4aff-beee-ea2f04dbf7b5"). InnerVolumeSpecName "kube-api-access-bvzjq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:53:01.457118 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.457058 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "89be8d83-3e69-4aff-beee-ea2f04dbf7b5" (UID: "89be8d83-3e69-4aff-beee-ea2f04dbf7b5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:53:01.482258 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.482216 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "89be8d83-3e69-4aff-beee-ea2f04dbf7b5" (UID: "89be8d83-3e69-4aff-beee-ea2f04dbf7b5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:53:01.556507 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.556470 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:53:01.556507 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.556503 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bvzjq\" (UniqueName: \"kubernetes.io/projected/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-kube-api-access-bvzjq\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:53:01.556744 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.556517 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:53:01.556744 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:01.556531 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/89be8d83-3e69-4aff-beee-ea2f04dbf7b5-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:53:02.251150 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:02.251108 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn" event={"ID":"89be8d83-3e69-4aff-beee-ea2f04dbf7b5","Type":"ContainerDied","Data":"c0db65275b48785f3ae25c88f2916446b33f4c1bf545c7a68dad0af0a9c0888f"}
Apr 24 19:53:02.251628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:02.251163 2564 scope.go:117] "RemoveContainer" containerID="bcd78891c603e52b893f50310554b73ab1d921c65b0eb665448eaa7eadfc4407"
Apr 24 19:53:02.251628 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:02.251165 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"
Apr 24 19:53:02.262164 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:02.261739 2564 scope.go:117] "RemoveContainer" containerID="a6eb6912247ec7dc28fd24b7ed946e07963d8cbdeb36b13aa7f94390c8adf901"
Apr 24 19:53:02.269633 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:02.269612 2564 scope.go:117] "RemoveContainer" containerID="a8dea13542f7a2e88203c485126c6b7b676441cb291ef1437b220e838f2a86bf"
Apr 24 19:53:02.275092 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:02.275065 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"]
Apr 24 19:53:02.278221 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:02.278199 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-777bccbf48-rzdnn"]
Apr 24 19:53:03.184455 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:03.184418 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" path="/var/lib/kubelet/pods/89be8d83-3e69-4aff-beee-ea2f04dbf7b5/volumes"
Apr 24 19:53:05.262709 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:05.262671 2564 generic.go:358] "Generic (PLEG): container finished" podID="f55a748d-b218-4285-95a1-654a4a780e95" containerID="cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a" exitCode=0
Apr 24 19:53:05.263191 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:05.262714 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerDied","Data":"cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a"}
Apr 24 19:53:06.266837 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:06.266753 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerStarted","Data":"a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34"}
Apr 24 19:53:06.266837 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:06.266797 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerStarted","Data":"30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4"}
Apr 24 19:53:06.267367 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:06.267021 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:06.267367 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:06.267045 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:06.286442 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:06.286395 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" podStartSLOduration=6.286378857 podStartE2EDuration="6.286378857s" podCreationTimestamp="2026-04-24 19:53:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:53:06.285080565 +0000 UTC m=+2779.609324750" watchObservedRunningTime="2026-04-24 19:53:06.286378857 +0000 UTC m=+2779.610623020"
Apr 24 19:53:12.275325 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:12.275293 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:53:42.318463 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:42.318403 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 24 19:53:52.277770 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:53:52.277741 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"
Apr 24 19:54:00.338600 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.338541 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"]
Apr 24 19:54:00.338980 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.338896 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kserve-container" containerID="cri-o://30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4" gracePeriod=30
Apr 24 19:54:00.338980 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.338941 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kube-rbac-proxy" containerID="cri-o://a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34" gracePeriod=30
Apr 24 19:54:00.441296 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441260 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"]
Apr 24 19:54:00.441564 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441540 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="storage-initializer"
Apr 24 19:54:00.441614 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441567 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="storage-initializer"
Apr 24 19:54:00.441614 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441579 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kube-rbac-proxy"
Apr 24 19:54:00.441614 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441585 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kube-rbac-proxy"
Apr 24 19:54:00.441614 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441601 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kserve-container"
Apr 24 19:54:00.441614 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441610 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kserve-container"
Apr 24 19:54:00.441776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441663 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kube-rbac-proxy"
Apr 24 19:54:00.441776 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.441674 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="89be8d83-3e69-4aff-beee-ea2f04dbf7b5" containerName="kserve-container"
Apr 24 19:54:00.444889 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.444868 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.447144 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.447115 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\""
Apr 24 19:54:00.447283 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.447229 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\""
Apr 24 19:54:00.453768 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.453746 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"]
Apr 24 19:54:00.538319 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.538285 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.538522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.538356 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.538522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.538404 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.538522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.538465 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c66ds\" (UniqueName: \"kubernetes.io/projected/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kube-api-access-c66ds\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.639075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.638994 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.639075 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.639046 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.639263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.639083 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c66ds\" (UniqueName: \"kubernetes.io/projected/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kube-api-access-c66ds\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.639263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.639138 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.639263 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:54:00.639228 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-predictor-serving-cert: secret "isvc-sklearn-v2-predictor-serving-cert" not found
Apr 24 19:54:00.639409 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:54:00.639310 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls podName:9a9688b5-9a3e-4ca5-9930-f2d2a51e1086 nodeName:}" failed. No retries permitted until 2026-04-24 19:54:01.13928596 +0000 UTC m=+2834.463530107 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls") pod "isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" (UID: "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086") : secret "isvc-sklearn-v2-predictor-serving-cert" not found
Apr 24 19:54:00.639475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.639433 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.639768 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.639748 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:00.647393 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:00.647365 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c66ds\" (UniqueName: \"kubernetes.io/projected/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kube-api-access-c66ds\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:01.143304 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:01.143268 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:01.145737 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:01.145717 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls\") pod \"isvc-sklearn-v2-predictor-78897cc5fb-zbtx5\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:01.356346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:01.356308 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:01.419938 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:01.419902 2564 generic.go:358] "Generic (PLEG): container finished" podID="f55a748d-b218-4285-95a1-654a4a780e95" containerID="a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34" exitCode=2
Apr 24 19:54:01.420119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:01.419958 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerDied","Data":"a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34"}
Apr 24 19:54:01.475742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:01.475707 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"]
Apr 24 19:54:01.479442 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:54:01.479404 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a9688b5_9a3e_4ca5_9930_f2d2a51e1086.slice/crio-cceb85d48ff71bc090a69b2cf32a7da7e8b30e714a318bf6261005eec528f199 WatchSource:0}: Error finding container cceb85d48ff71bc090a69b2cf32a7da7e8b30e714a318bf6261005eec528f199: Status 404 returned error can't find the container with id cceb85d48ff71bc090a69b2cf32a7da7e8b30e714a318bf6261005eec528f199
Apr 24 19:54:01.481222 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:01.481207 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 19:54:02.270315 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:02.270270 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.49:8643/healthz\": dial tcp 10.132.0.49:8643: connect: connection refused"
Apr 24 19:54:02.424129 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:02.424091 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerStarted","Data":"be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45"}
Apr 24 19:54:02.424129 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:02.424131 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerStarted","Data":"cceb85d48ff71bc090a69b2cf32a7da7e8b30e714a318bf6261005eec528f199"}
Apr 24 19:54:03.317897 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:03.317848 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.49:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 24 19:54:05.433606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:05.433565 2564 generic.go:358] "Generic (PLEG): container finished" podID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerID="be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45" exitCode=0
Apr 24 19:54:05.433980 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:05.433613 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerDied","Data":"be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45"}
Apr 24 19:54:06.438966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:06.438931 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerStarted","Data":"0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317"}
Apr 24 19:54:06.438966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:06.438973 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerStarted","Data":"3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6"}
Apr 24 19:54:06.439503 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:06.439203 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:06.458334 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:06.458284 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podStartSLOduration=6.458267687 podStartE2EDuration="6.458267687s" podCreationTimestamp="2026-04-24 19:54:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:54:06.456355024 +0000 UTC m=+2839.780599185" watchObservedRunningTime="2026-04-24 19:54:06.458267687 +0000 UTC m=+2839.782511849"
Apr 24 19:54:07.270492 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.270447 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.49:8643/healthz\": dial tcp 10.132.0.49:8643: connect: connection refused"
Apr 24 19:54:07.442412 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.442375 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:54:07.443603 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.443573 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 24 19:54:07.881644 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.881620 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" Apr 24 19:54:07.896812 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.896784 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f55a748d-b218-4285-95a1-654a4a780e95-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"f55a748d-b218-4285-95a1-654a4a780e95\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " Apr 24 19:54:07.896971 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.896838 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f55a748d-b218-4285-95a1-654a4a780e95-kserve-provision-location\") pod \"f55a748d-b218-4285-95a1-654a4a780e95\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " Apr 24 19:54:07.896971 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.896867 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgc78\" (UniqueName: \"kubernetes.io/projected/f55a748d-b218-4285-95a1-654a4a780e95-kube-api-access-qgc78\") pod \"f55a748d-b218-4285-95a1-654a4a780e95\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " Apr 24 19:54:07.896971 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.896919 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f55a748d-b218-4285-95a1-654a4a780e95-proxy-tls\") pod \"f55a748d-b218-4285-95a1-654a4a780e95\" (UID: \"f55a748d-b218-4285-95a1-654a4a780e95\") " Apr 24 19:54:07.897175 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.897149 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f55a748d-b218-4285-95a1-654a4a780e95-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "f55a748d-b218-4285-95a1-654a4a780e95" (UID: "f55a748d-b218-4285-95a1-654a4a780e95"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:54:07.897229 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.897203 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f55a748d-b218-4285-95a1-654a4a780e95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f55a748d-b218-4285-95a1-654a4a780e95" (UID: "f55a748d-b218-4285-95a1-654a4a780e95"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:54:07.899053 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.899029 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f55a748d-b218-4285-95a1-654a4a780e95-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "f55a748d-b218-4285-95a1-654a4a780e95" (UID: "f55a748d-b218-4285-95a1-654a4a780e95"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:54:07.899147 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.899092 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f55a748d-b218-4285-95a1-654a4a780e95-kube-api-access-qgc78" (OuterVolumeSpecName: "kube-api-access-qgc78") pod "f55a748d-b218-4285-95a1-654a4a780e95" (UID: "f55a748d-b218-4285-95a1-654a4a780e95"). InnerVolumeSpecName "kube-api-access-qgc78". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:54:07.998150 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.998064 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/f55a748d-b218-4285-95a1-654a4a780e95-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:54:07.998150 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.998094 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f55a748d-b218-4285-95a1-654a4a780e95-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:54:07.998150 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.998105 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qgc78\" (UniqueName: \"kubernetes.io/projected/f55a748d-b218-4285-95a1-654a4a780e95-kube-api-access-qgc78\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:54:07.998150 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:07.998114 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/f55a748d-b218-4285-95a1-654a4a780e95-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:54:08.447250 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.447209 2564 generic.go:358] "Generic (PLEG): container finished" podID="f55a748d-b218-4285-95a1-654a4a780e95" containerID="30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4" exitCode=0 Apr 24 19:54:08.447760 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.447282 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" Apr 24 19:54:08.447760 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.447283 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerDied","Data":"30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4"} Apr 24 19:54:08.447760 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.447385 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm" event={"ID":"f55a748d-b218-4285-95a1-654a4a780e95","Type":"ContainerDied","Data":"a5caf7a82074efe50e103d5f81c10b53a868b2b728f186cfa036aca7c6ae84b7"} Apr 24 19:54:08.447760 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.447405 2564 scope.go:117] "RemoveContainer" containerID="a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34" Apr 24 19:54:08.448080 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.448054 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:54:08.460779 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.460760 2564 scope.go:117] "RemoveContainer" containerID="30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4" Apr 24 19:54:08.467985 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.467968 2564 scope.go:117] "RemoveContainer" containerID="cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a" Apr 24 19:54:08.474348 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.474323 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"] Apr 24 19:54:08.475641 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.475617 2564 scope.go:117] "RemoveContainer" containerID="a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34" Apr 24 19:54:08.475943 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:54:08.475918 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34\": container with ID starting with a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34 not found: ID does not exist" containerID="a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34" Apr 24 19:54:08.475999 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.475953 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34"} err="failed to get container status \"a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34\": rpc error: code = NotFound desc = could not find container \"a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34\": container with ID starting with a1eb63eda30748bd51a03a09cf75fb6642e9a68e145c6590c4e9bb855a1fbb34 not found: ID does not exist" Apr 24 19:54:08.475999 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.475973 2564 scope.go:117] "RemoveContainer" containerID="30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4" Apr 24 19:54:08.476226 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:54:08.476206 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4\": container with ID starting with 30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4 not found: ID does not exist" 
containerID="30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4" Apr 24 19:54:08.476280 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.476233 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4"} err="failed to get container status \"30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4\": rpc error: code = NotFound desc = could not find container \"30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4\": container with ID starting with 30295e2aa5650b0f6845ae88578d7b7bfcb7d28edb51fbe8a4462765207ba7d4 not found: ID does not exist" Apr 24 19:54:08.476280 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.476250 2564 scope.go:117] "RemoveContainer" containerID="cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a" Apr 24 19:54:08.476473 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:54:08.476455 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a\": container with ID starting with cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a not found: ID does not exist" containerID="cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a" Apr 24 19:54:08.476509 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.476479 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a"} err="failed to get container status \"cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a\": rpc error: code = NotFound desc = could not find container \"cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a\": container with ID starting with cab999e6fcdfe817e536b00ec0031ed2ea17cb522fc55b2e4c74679a0a75fc6a not found: ID does not exist" Apr 24 
19:54:08.478690 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:08.478672 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-vbjlm"] Apr 24 19:54:09.184070 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:09.184037 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f55a748d-b218-4285-95a1-654a4a780e95" path="/var/lib/kubelet/pods/f55a748d-b218-4285-95a1-654a4a780e95/volumes" Apr 24 19:54:13.451686 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:13.451654 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" Apr 24 19:54:13.452187 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:13.452156 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:54:23.452866 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:23.452814 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:54:33.452894 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:33.452855 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:54:43.453104 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:43.453015 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:54:53.452203 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:54:53.452162 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:55:03.453063 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:03.453022 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 24 19:55:13.453216 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:13.453179 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" Apr 24 19:55:20.616097 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.616055 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"] Apr 24 19:55:20.616629 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.616378 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" containerID="cri-o://3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6" gracePeriod=30 Apr 24 19:55:20.616725 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.616693 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kube-rbac-proxy" containerID="cri-o://0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317" gracePeriod=30 Apr 24 19:55:20.688966 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.688934 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"] Apr 24 19:55:20.689217 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689205 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kserve-container" Apr 24 19:55:20.689278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689218 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kserve-container" Apr 24 19:55:20.689278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689234 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="storage-initializer" Apr 24 19:55:20.689278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689239 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="storage-initializer" Apr 24 19:55:20.689278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689253 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kube-rbac-proxy" Apr 24 19:55:20.689278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689260 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kube-rbac-proxy" Apr 24 19:55:20.689475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689306 2564 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kserve-container" Apr 24 19:55:20.689475 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.689318 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f55a748d-b218-4285-95a1-654a4a780e95" containerName="kube-rbac-proxy" Apr 24 19:55:20.692473 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.692453 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.694871 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.694850 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 24 19:55:20.694972 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.694853 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 24 19:55:20.700980 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.700955 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"] Apr 24 19:55:20.763527 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.763490 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.763699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.763531 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.763699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.763615 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.763699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.763653 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk6rm\" (UniqueName: \"kubernetes.io/projected/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kube-api-access-kk6rm\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.864519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.864482 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.864519 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.864521 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk6rm\" (UniqueName: \"kubernetes.io/projected/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kube-api-access-kk6rm\") pod 
\"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.864756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.864603 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.864756 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.864622 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.864756 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:55:20.864654 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-serving-cert: secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 24 19:55:20.864756 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:55:20.864728 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls podName:4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3 nodeName:}" failed. No retries permitted until 2026-04-24 19:55:21.364709139 +0000 UTC m=+2914.688953284 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls") pod "isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" (UID: "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3") : secret "isvc-sklearn-v2-mixed-predictor-serving-cert" not found Apr 24 19:55:20.865058 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.865037 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.865317 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.865299 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:20.875652 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:20.875583 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk6rm\" (UniqueName: \"kubernetes.io/projected/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kube-api-access-kk6rm\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:55:21.367642 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:21.367603 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:55:21.369996 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:21.369973 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:55:21.602922 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:21.602873 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:55:21.658852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:21.658815 2564 generic.go:358] "Generic (PLEG): container finished" podID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerID="0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317" exitCode=2
Apr 24 19:55:21.659384 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:21.658883 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerDied","Data":"0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317"}
Apr 24 19:55:21.727148 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:21.726882 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"]
Apr 24 19:55:21.729660 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:55:21.729631 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f7cd4c9_5f6b_4acb_8caf_a5089b6712b3.slice/crio-2b461590087b442e7d2696a17a3cfeef760144222080aabe0af94315016009fe WatchSource:0}: Error finding container 2b461590087b442e7d2696a17a3cfeef760144222080aabe0af94315016009fe: Status 404 returned error can't find the container with id 2b461590087b442e7d2696a17a3cfeef760144222080aabe0af94315016009fe
Apr 24 19:55:22.662446 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:22.662409 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerStarted","Data":"f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2"}
Apr 24 19:55:22.662446 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:22.662452 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerStarted","Data":"2b461590087b442e7d2696a17a3cfeef760144222080aabe0af94315016009fe"}
Apr 24 19:55:23.448683 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:23.448635 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.50:8643/healthz\": dial tcp 10.132.0.50:8643: connect: connection refused"
Apr 24 19:55:23.452866 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:23.452826 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 24 19:55:24.955138 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.955114 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:55:24.990745 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.990667 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c66ds\" (UniqueName: \"kubernetes.io/projected/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kube-api-access-c66ds\") pod \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") "
Apr 24 19:55:24.990745 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.990702 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") "
Apr 24 19:55:24.990745 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.990745 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls\") pod \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") "
Apr 24 19:55:24.991013 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.990763 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kserve-provision-location\") pod \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\" (UID: \"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086\") "
Apr 24 19:55:24.991111 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.991085 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" (UID: "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:55:24.991232 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.991125 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" (UID: "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:55:24.992823 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.992794 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" (UID: "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:55:24.992823 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:24.992803 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kube-api-access-c66ds" (OuterVolumeSpecName: "kube-api-access-c66ds") pod "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" (UID: "9a9688b5-9a3e-4ca5-9930-f2d2a51e1086"). InnerVolumeSpecName "kube-api-access-c66ds". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:55:25.091513 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.091477 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c66ds\" (UniqueName: \"kubernetes.io/projected/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kube-api-access-c66ds\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:55:25.091513 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.091508 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:55:25.091513 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.091519 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:55:25.091766 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.091529 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:55:25.672208 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.672167 2564 generic.go:358] "Generic (PLEG): container finished" podID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerID="3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6" exitCode=0
Apr 24 19:55:25.672394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.672221 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerDied","Data":"3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6"}
Apr 24 19:55:25.672394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.672241 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"
Apr 24 19:55:25.672394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.672263 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5" event={"ID":"9a9688b5-9a3e-4ca5-9930-f2d2a51e1086","Type":"ContainerDied","Data":"cceb85d48ff71bc090a69b2cf32a7da7e8b30e714a318bf6261005eec528f199"}
Apr 24 19:55:25.672394 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.672280 2564 scope.go:117] "RemoveContainer" containerID="0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317"
Apr 24 19:55:25.680359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.680340 2564 scope.go:117] "RemoveContainer" containerID="3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6"
Apr 24 19:55:25.687583 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.687563 2564 scope.go:117] "RemoveContainer" containerID="be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45"
Apr 24 19:55:25.688813 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.688790 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"]
Apr 24 19:55:25.694214 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.694186 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-78897cc5fb-zbtx5"]
Apr 24 19:55:25.695111 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.695089 2564 scope.go:117] "RemoveContainer" containerID="0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317"
Apr 24 19:55:25.695390 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:55:25.695372 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317\": container with ID starting with 0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317 not found: ID does not exist" containerID="0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317"
Apr 24 19:55:25.695458 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.695400 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317"} err="failed to get container status \"0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317\": rpc error: code = NotFound desc = could not find container \"0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317\": container with ID starting with 0aeb906c04a09db560e66638d8be3dd824995288aab148a3173da2bddf833317 not found: ID does not exist"
Apr 24 19:55:25.695458 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.695420 2564 scope.go:117] "RemoveContainer" containerID="3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6"
Apr 24 19:55:25.695695 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:55:25.695672 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6\": container with ID starting with 3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6 not found: ID does not exist" containerID="3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6"
Apr 24 19:55:25.695774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.695705 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6"} err="failed to get container status \"3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6\": rpc error: code = NotFound desc = could not find container \"3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6\": container with ID starting with 3a9d55e00064651f9f5fbde6470578fcc64b0e8d53313627f2bc3d85c8f11be6 not found: ID does not exist"
Apr 24 19:55:25.695774 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.695728 2564 scope.go:117] "RemoveContainer" containerID="be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45"
Apr 24 19:55:25.695953 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:55:25.695935 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45\": container with ID starting with be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45 not found: ID does not exist" containerID="be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45"
Apr 24 19:55:25.695992 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:25.695961 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45"} err="failed to get container status \"be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45\": rpc error: code = NotFound desc = could not find container \"be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45\": container with ID starting with be0fac1b82e667d105e2c207aa769b046df9dcdab7369443efb9af686ce2cb45 not found: ID does not exist"
Apr 24 19:55:26.676755 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:26.676719 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerID="f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2" exitCode=0
Apr 24 19:55:26.677242 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:26.676778 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerDied","Data":"f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2"}
Apr 24 19:55:27.183943 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:27.183909 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" path="/var/lib/kubelet/pods/9a9688b5-9a3e-4ca5-9930-f2d2a51e1086/volumes"
Apr 24 19:55:27.680943 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:27.680906 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerStarted","Data":"3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad"}
Apr 24 19:55:27.680943 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:27.680948 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerStarted","Data":"4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e"}
Apr 24 19:55:27.681383 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:27.681157 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:55:27.701154 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:27.701105 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podStartSLOduration=7.701092705 podStartE2EDuration="7.701092705s" podCreationTimestamp="2026-04-24 19:55:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:55:27.700139095 +0000 UTC m=+2921.024383249" watchObservedRunningTime="2026-04-24 19:55:27.701092705 +0000 UTC m=+2921.025336867"
Apr 24 19:55:28.683854 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:28.683820 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:55:28.685119 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:28.685089 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:55:29.687043 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:29.686999 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:55:34.690918 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:34.690888 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:55:34.691386 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:34.691353 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:55:44.691598 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:44.691537 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:55:54.692077 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:55:54.692032 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:56:04.691918 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:04.691876 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:56:14.692043 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:14.691956 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:56:24.691329 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:24.691289 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:56:34.692467 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:34.692438 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:56:40.808005 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.807969 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"]
Apr 24 19:56:40.808513 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.808393 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" containerID="cri-o://4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e" gracePeriod=30
Apr 24 19:56:40.808513 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.808431 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kube-rbac-proxy" containerID="cri-o://3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad" gracePeriod=30
Apr 24 19:56:40.898803 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.898764 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"]
Apr 24 19:56:40.899055 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899042 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kube-rbac-proxy"
Apr 24 19:56:40.899055 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899056 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kube-rbac-proxy"
Apr 24 19:56:40.899176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899071 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container"
Apr 24 19:56:40.899176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899077 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container"
Apr 24 19:56:40.899176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899085 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="storage-initializer"
Apr 24 19:56:40.899176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899090 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="storage-initializer"
Apr 24 19:56:40.899176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899135 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kserve-container"
Apr 24 19:56:40.899176 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.899144 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a9688b5-9a3e-4ca5-9930-f2d2a51e1086" containerName="kube-rbac-proxy"
Apr 24 19:56:40.902278 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.902253 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:40.905016 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.904987 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\""
Apr 24 19:56:40.905161 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.905138 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\""
Apr 24 19:56:40.910464 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:40.910433 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"]
Apr 24 19:56:41.076359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.076256 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2f2755-2b8f-4a43-adb4-3b62b3481168-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.076359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.076301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24p6l\" (UniqueName: \"kubernetes.io/projected/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kube-api-access-24p6l\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.076359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.076336 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.076620 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.076363 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e2f2755-2b8f-4a43-adb4-3b62b3481168-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.177666 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.177636 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.177931 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.177678 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e2f2755-2b8f-4a43-adb4-3b62b3481168-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.177931 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.177738 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2f2755-2b8f-4a43-adb4-3b62b3481168-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.177931 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.177760 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24p6l\" (UniqueName: \"kubernetes.io/projected/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kube-api-access-24p6l\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.178116 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.178068 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.178359 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.178330 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e2f2755-2b8f-4a43-adb4-3b62b3481168-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.180231 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.180207 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2f2755-2b8f-4a43-adb4-3b62b3481168-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.186469 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.186446 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24p6l\" (UniqueName: \"kubernetes.io/projected/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kube-api-access-24p6l\") pod \"isvc-tensorflow-predictor-6756f669d7-rqzdx\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.215481 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.215441 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
Apr 24 19:56:41.342447 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.342418 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"]
Apr 24 19:56:41.344983 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:56:41.344955 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2f2755_2b8f_4a43_adb4_3b62b3481168.slice/crio-5d314291c29e9ef2fa6f22962658e7739031c152215d8f84d7e8b3cc503f2ac6 WatchSource:0}: Error finding container 5d314291c29e9ef2fa6f22962658e7739031c152215d8f84d7e8b3cc503f2ac6: Status 404 returned error can't find the container with id 5d314291c29e9ef2fa6f22962658e7739031c152215d8f84d7e8b3cc503f2ac6
Apr 24 19:56:41.882379 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.882338 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerStarted","Data":"bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6"}
Apr 24 19:56:41.882379 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.882380 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerStarted","Data":"5d314291c29e9ef2fa6f22962658e7739031c152215d8f84d7e8b3cc503f2ac6"}
Apr 24 19:56:41.884343 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.884315 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerID="3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad" exitCode=2
Apr 24 19:56:41.884453 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:41.884350 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerDied","Data":"3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad"}
Apr 24 19:56:44.687163 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:44.687123 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.51:8643/healthz\": dial tcp 10.132.0.51:8643: connect: connection refused"
Apr 24 19:56:44.691406 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:44.691373 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 24 19:56:45.258417 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.258387 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"
Apr 24 19:56:45.408545 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.408451 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kk6rm\" (UniqueName: \"kubernetes.io/projected/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kube-api-access-kk6rm\") pod \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") "
Apr 24 19:56:45.408545 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.408528 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kserve-provision-location\") pod \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") "
Apr 24 19:56:45.408804 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.408578 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls\") pod \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") "
Apr 24 19:56:45.408804 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.408613 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\" (UID: \"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3\") "
Apr 24 19:56:45.408928 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.408876 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" (UID: "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 19:56:45.409080 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.409053 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" (UID: "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 19:56:45.410787 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.410755 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" (UID: "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 19:56:45.410895 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.410851 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kube-api-access-kk6rm" (OuterVolumeSpecName: "kube-api-access-kk6rm") pod "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" (UID: "4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3"). InnerVolumeSpecName "kube-api-access-kk6rm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 19:56:45.509574 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.509509 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:56:45.509744 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.509585 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:56:45.509744 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.509604 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:56:45.509744 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.509626 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kk6rm\" (UniqueName: \"kubernetes.io/projected/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3-kube-api-access-kk6rm\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 19:56:45.895742 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.895710 2564 generic.go:358] "Generic (PLEG): container finished" podID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerID="bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6" exitCode=0
Apr 24 19:56:45.896202 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.895789 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"
event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerDied","Data":"bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6"} Apr 24 19:56:45.897526 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.897500 2564 generic.go:358] "Generic (PLEG): container finished" podID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerID="4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e" exitCode=0 Apr 24 19:56:45.897606 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.897589 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" Apr 24 19:56:45.897656 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.897598 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerDied","Data":"4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e"} Apr 24 19:56:45.897656 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.897643 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s" event={"ID":"4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3","Type":"ContainerDied","Data":"2b461590087b442e7d2696a17a3cfeef760144222080aabe0af94315016009fe"} Apr 24 19:56:45.897794 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.897664 2564 scope.go:117] "RemoveContainer" containerID="3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad" Apr 24 19:56:45.909356 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.907242 2564 scope.go:117] "RemoveContainer" containerID="4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e" Apr 24 19:56:45.917258 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.917238 2564 scope.go:117] "RemoveContainer" containerID="f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2" Apr 24 19:56:45.925143 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.925121 2564 scope.go:117] "RemoveContainer" containerID="3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad" Apr 24 19:56:45.925412 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:56:45.925391 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad\": container with ID starting with 3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad not found: ID does not exist" containerID="3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad" Apr 24 19:56:45.925506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.925428 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad"} err="failed to get container status \"3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad\": rpc error: code = NotFound desc = could not find container \"3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad\": container with ID starting with 3b9eba04db166054d73af93393bfaf272cff183c96b3eab3a37c24acc86efdad not found: ID does not exist" Apr 24 19:56:45.925506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.925458 2564 scope.go:117] "RemoveContainer" containerID="4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e" Apr 24 19:56:45.925772 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:56:45.925746 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e\": container with ID starting with 4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e not found: ID does not exist" containerID="4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e" Apr 24 19:56:45.925821 ip-10-0-129-124 
kubenswrapper[2564]: I0424 19:56:45.925782 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e"} err="failed to get container status \"4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e\": rpc error: code = NotFound desc = could not find container \"4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e\": container with ID starting with 4b06ac9fa3b6c0b29201f0b4e81146ad47d3b881818271162b293b8fd474df8e not found: ID does not exist" Apr 24 19:56:45.925821 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.925808 2564 scope.go:117] "RemoveContainer" containerID="f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2" Apr 24 19:56:45.926037 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:56:45.926019 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2\": container with ID starting with f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2 not found: ID does not exist" containerID="f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2" Apr 24 19:56:45.926094 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.926046 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2"} err="failed to get container status \"f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2\": rpc error: code = NotFound desc = could not find container \"f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2\": container with ID starting with f1b6ceacec40a9ffa2042860d129bbba48847d463e50068b2aabbece6640abb2 not found: ID does not exist" Apr 24 19:56:45.932923 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.932887 2564 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"] Apr 24 19:56:45.937964 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:45.937937 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-6554fcd6d5-9k24s"] Apr 24 19:56:47.185980 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:47.185942 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" path="/var/lib/kubelet/pods/4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3/volumes" Apr 24 19:56:49.913940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:49.913847 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerStarted","Data":"15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c"} Apr 24 19:56:49.913940 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:49.913891 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerStarted","Data":"c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea"} Apr 24 19:56:49.914458 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:49.914125 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" Apr 24 19:56:49.932637 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:49.932583 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podStartSLOduration=6.28799955 podStartE2EDuration="9.932538413s" podCreationTimestamp="2026-04-24 19:56:40 +0000 UTC" firstStartedPulling="2026-04-24 19:56:45.897156924 +0000 UTC m=+2999.221401082" lastFinishedPulling="2026-04-24 19:56:49.5416958 +0000 UTC 
m=+3002.865939945" observedRunningTime="2026-04-24 19:56:49.931529044 +0000 UTC m=+3003.255773206" watchObservedRunningTime="2026-04-24 19:56:49.932538413 +0000 UTC m=+3003.256782575" Apr 24 19:56:50.917163 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:50.917121 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" Apr 24 19:56:50.918381 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:50.918350 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 24 19:56:51.919658 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:51.919612 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 24 19:56:56.924790 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:56.924753 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" Apr 24 19:56:56.925343 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:56:56.925319 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 24 19:57:06.925779 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:06.925747 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" Apr 24 19:57:22.073372 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.073332 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"] Apr 24 19:57:22.073852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.073801 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kserve-container" containerID="cri-o://c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea" gracePeriod=30 Apr 24 19:57:22.073907 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.073836 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" containerID="cri-o://15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c" gracePeriod=30 Apr 24 19:57:22.151636 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151597 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv"] Apr 24 19:57:22.151888 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151876 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="storage-initializer" Apr 24 19:57:22.151932 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151889 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="storage-initializer" Apr 24 19:57:22.151932 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151899 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" Apr 24 19:57:22.151932 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151905 2564 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" Apr 24 19:57:22.151932 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151916 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kube-rbac-proxy" Apr 24 19:57:22.151932 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151922 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kube-rbac-proxy" Apr 24 19:57:22.152105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151976 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kserve-container" Apr 24 19:57:22.152105 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.151987 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f7cd4c9-5f6b-4acb-8caf-a5089b6712b3" containerName="kube-rbac-proxy" Apr 24 19:57:22.160199 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.160170 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.163038 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.163005 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 24 19:57:22.163302 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.163276 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 24 19:57:22.163708 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.163682 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv"] Apr 24 19:57:22.188948 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.188917 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8decbe48-ebbb-409c-88c3-7ad460858abe-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.189114 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.188962 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8decbe48-ebbb-409c-88c3-7ad460858abe-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.189114 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.189048 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/8decbe48-ebbb-409c-88c3-7ad460858abe-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.189114 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.189083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkws\" (UniqueName: \"kubernetes.io/projected/8decbe48-ebbb-409c-88c3-7ad460858abe-kube-api-access-pgkws\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.290225 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.290181 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8decbe48-ebbb-409c-88c3-7ad460858abe-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.290417 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.290249 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8decbe48-ebbb-409c-88c3-7ad460858abe-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.290417 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.290285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-pgkws\" (UniqueName: \"kubernetes.io/projected/8decbe48-ebbb-409c-88c3-7ad460858abe-kube-api-access-pgkws\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.290417 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.290336 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8decbe48-ebbb-409c-88c3-7ad460858abe-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.290696 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.290676 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8decbe48-ebbb-409c-88c3-7ad460858abe-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.291019 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.290997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8decbe48-ebbb-409c-88c3-7ad460858abe-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.292785 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.292761 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/8decbe48-ebbb-409c-88c3-7ad460858abe-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.298779 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.298755 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkws\" (UniqueName: \"kubernetes.io/projected/8decbe48-ebbb-409c-88c3-7ad460858abe-kube-api-access-pgkws\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.471514 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.471472 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:22.591577 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:22.591535 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv"] Apr 24 19:57:22.594114 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:57:22.594085 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8decbe48_ebbb_409c_88c3_7ad460858abe.slice/crio-4d85482507cdb25f6d009c9c05c4eca806e0bbce48b2dab5547c1caeef87bdce WatchSource:0}: Error finding container 4d85482507cdb25f6d009c9c05c4eca806e0bbce48b2dab5547c1caeef87bdce: Status 404 returned error can't find the container with id 4d85482507cdb25f6d009c9c05c4eca806e0bbce48b2dab5547c1caeef87bdce Apr 24 19:57:23.006620 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:23.006582 2564 generic.go:358] "Generic (PLEG): container finished" podID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerID="15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c" exitCode=2 
Apr 24 19:57:23.006796 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:23.006654 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerDied","Data":"15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c"} Apr 24 19:57:23.007845 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:23.007813 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerStarted","Data":"d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22"} Apr 24 19:57:23.007845 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:23.007846 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerStarted","Data":"4d85482507cdb25f6d009c9c05c4eca806e0bbce48b2dab5547c1caeef87bdce"} Apr 24 19:57:26.920284 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:26.920231 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 24 19:57:28.023655 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:28.023623 2564 generic.go:358] "Generic (PLEG): container finished" podID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerID="d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22" exitCode=0 Apr 24 19:57:28.024049 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:28.023672 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" 
event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerDied","Data":"d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22"} Apr 24 19:57:29.027890 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:29.027857 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerStarted","Data":"f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155"} Apr 24 19:57:29.027890 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:29.027896 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerStarted","Data":"0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857"} Apr 24 19:57:29.028522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:29.028174 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:29.028522 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:29.028287 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:29.029383 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:29.029358 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 24 19:57:29.045585 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:29.045526 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podStartSLOduration=7.045514546 
podStartE2EDuration="7.045514546s" podCreationTimestamp="2026-04-24 19:57:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 19:57:29.044311595 +0000 UTC m=+3042.368555758" watchObservedRunningTime="2026-04-24 19:57:29.045514546 +0000 UTC m=+3042.369758709" Apr 24 19:57:30.030895 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:30.030853 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 24 19:57:31.920668 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:31.920609 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 24 19:57:35.035699 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:35.035670 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:35.036270 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:35.036242 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 24 19:57:36.920367 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:36.920248 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" 
podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 24 19:57:36.920779 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:36.920459 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" Apr 24 19:57:41.920497 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:41.920451 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 24 19:57:45.036784 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:45.036699 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:57:46.920907 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:46.920856 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 24 19:57:51.919893 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:51.919846 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 24 19:57:52.706631 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.706608 2564 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" Apr 24 19:57:52.843380 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.843289 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kserve-provision-location\") pod \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " Apr 24 19:57:52.843380 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.843343 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e2f2755-2b8f-4a43-adb4-3b62b3481168-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " Apr 24 19:57:52.843380 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.843369 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2f2755-2b8f-4a43-adb4-3b62b3481168-proxy-tls\") pod \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " Apr 24 19:57:52.843718 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.843393 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24p6l\" (UniqueName: \"kubernetes.io/projected/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kube-api-access-24p6l\") pod \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\" (UID: \"6e2f2755-2b8f-4a43-adb4-3b62b3481168\") " Apr 24 19:57:52.843833 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.843800 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e2f2755-2b8f-4a43-adb4-3b62b3481168-isvc-tensorflow-kube-rbac-proxy-sar-config" 
(OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "6e2f2755-2b8f-4a43-adb4-3b62b3481168" (UID: "6e2f2755-2b8f-4a43-adb4-3b62b3481168"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:57:52.845596 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.845567 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kube-api-access-24p6l" (OuterVolumeSpecName: "kube-api-access-24p6l") pod "6e2f2755-2b8f-4a43-adb4-3b62b3481168" (UID: "6e2f2755-2b8f-4a43-adb4-3b62b3481168"). InnerVolumeSpecName "kube-api-access-24p6l". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:57:52.845701 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.845675 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e2f2755-2b8f-4a43-adb4-3b62b3481168-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6e2f2755-2b8f-4a43-adb4-3b62b3481168" (UID: "6e2f2755-2b8f-4a43-adb4-3b62b3481168"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:57:52.854360 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.854327 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6e2f2755-2b8f-4a43-adb4-3b62b3481168" (UID: "6e2f2755-2b8f-4a43-adb4-3b62b3481168"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:57:52.944896 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.944856 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:57:52.944896 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.944889 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6e2f2755-2b8f-4a43-adb4-3b62b3481168-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:57:52.944896 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.944902 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6e2f2755-2b8f-4a43-adb4-3b62b3481168-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:57:52.945346 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:52.944912 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24p6l\" (UniqueName: \"kubernetes.io/projected/6e2f2755-2b8f-4a43-adb4-3b62b3481168-kube-api-access-24p6l\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:57:53.098934 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.098839 2564 generic.go:358] "Generic (PLEG): container finished" podID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerID="c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea" exitCode=137 Apr 24 19:57:53.099083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.098928 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" 
event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerDied","Data":"c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea"} Apr 24 19:57:53.099083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.098942 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" Apr 24 19:57:53.099083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.098968 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx" event={"ID":"6e2f2755-2b8f-4a43-adb4-3b62b3481168","Type":"ContainerDied","Data":"5d314291c29e9ef2fa6f22962658e7739031c152215d8f84d7e8b3cc503f2ac6"} Apr 24 19:57:53.099083 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.098985 2564 scope.go:117] "RemoveContainer" containerID="15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c" Apr 24 19:57:53.106870 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.106614 2564 scope.go:117] "RemoveContainer" containerID="c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea" Apr 24 19:57:53.114052 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.114035 2564 scope.go:117] "RemoveContainer" containerID="bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6" Apr 24 19:57:53.121611 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.121584 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"] Apr 24 19:57:53.121861 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.121844 2564 scope.go:117] "RemoveContainer" containerID="15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c" Apr 24 19:57:53.122210 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:57:53.122178 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c\": container with ID starting with 15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c not found: ID does not exist" containerID="15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c" Apr 24 19:57:53.122384 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.122217 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c"} err="failed to get container status \"15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c\": rpc error: code = NotFound desc = could not find container \"15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c\": container with ID starting with 15b676bd6b8924c0acd119a94c47d2975fcd07b31fe17db86944264575c5ce0c not found: ID does not exist" Apr 24 19:57:53.122384 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.122237 2564 scope.go:117] "RemoveContainer" containerID="c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea" Apr 24 19:57:53.122576 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:57:53.122496 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea\": container with ID starting with c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea not found: ID does not exist" containerID="c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea" Apr 24 19:57:53.122576 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.122517 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea"} err="failed to get container status \"c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea\": rpc error: code = NotFound desc = could not find container 
\"c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea\": container with ID starting with c7d3b37bd4e18e33dd6b259c92af76fc1cb513b30c32de5f04322df6814e30ea not found: ID does not exist" Apr 24 19:57:53.122576 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.122531 2564 scope.go:117] "RemoveContainer" containerID="bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6" Apr 24 19:57:53.122846 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:57:53.122830 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6\": container with ID starting with bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6 not found: ID does not exist" containerID="bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6" Apr 24 19:57:53.122919 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.122848 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6"} err="failed to get container status \"bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6\": rpc error: code = NotFound desc = could not find container \"bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6\": container with ID starting with bc3b2fca0eefcb19d9857e2a48c41b5da3e20d19839fac0193cde6765b2171d6 not found: ID does not exist" Apr 24 19:57:53.127308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.127287 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-rqzdx"] Apr 24 19:57:53.184722 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:57:53.184686 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" path="/var/lib/kubelet/pods/6e2f2755-2b8f-4a43-adb4-3b62b3481168/volumes" Apr 24 19:58:03.547217 
ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.547183 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv"] Apr 24 19:58:03.547626 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.547574 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kserve-container" containerID="cri-o://0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857" gracePeriod=30 Apr 24 19:58:03.547728 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.547636 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" containerID="cri-o://f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155" gracePeriod=30 Apr 24 19:58:03.653530 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653495 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn"] Apr 24 19:58:03.653870 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653852 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" Apr 24 19:58:03.653952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653873 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" Apr 24 19:58:03.653952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653889 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="storage-initializer" Apr 24 19:58:03.653952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653897 2564 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="storage-initializer" Apr 24 19:58:03.653952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653915 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kserve-container" Apr 24 19:58:03.653952 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653926 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kserve-container" Apr 24 19:58:03.654206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.653998 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kserve-container" Apr 24 19:58:03.654206 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.654014 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="6e2f2755-2b8f-4a43-adb4-3b62b3481168" containerName="kube-rbac-proxy" Apr 24 19:58:03.657240 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.657219 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.659683 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.659662 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 24 19:58:03.659683 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.659672 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 24 19:58:03.665403 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.665377 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn"] Apr 24 19:58:03.827455 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.827362 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/def9c1eb-27f2-4770-9867-122466fc256f-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.827455 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.827432 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggdrh\" (UniqueName: \"kubernetes.io/projected/def9c1eb-27f2-4770-9867-122466fc256f-kube-api-access-ggdrh\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.827665 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.827460 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/def9c1eb-27f2-4770-9867-122466fc256f-proxy-tls\") 
pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.827665 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.827490 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/def9c1eb-27f2-4770-9867-122466fc256f-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.928071 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.928030 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/def9c1eb-27f2-4770-9867-122466fc256f-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.928279 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.928089 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/def9c1eb-27f2-4770-9867-122466fc256f-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.928279 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.928114 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/def9c1eb-27f2-4770-9867-122466fc256f-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.928279 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.928167 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggdrh\" (UniqueName: \"kubernetes.io/projected/def9c1eb-27f2-4770-9867-122466fc256f-kube-api-access-ggdrh\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.928573 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.928525 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/def9c1eb-27f2-4770-9867-122466fc256f-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.928801 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.928778 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/def9c1eb-27f2-4770-9867-122466fc256f-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.930637 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.930616 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/def9c1eb-27f2-4770-9867-122466fc256f-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.937060 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.937034 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-ggdrh\" (UniqueName: \"kubernetes.io/projected/def9c1eb-27f2-4770-9867-122466fc256f-kube-api-access-ggdrh\") pod \"isvc-triton-predictor-84bb65d94b-nbfjn\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:03.968532 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:03.968487 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 19:58:04.090385 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:04.090351 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn"] Apr 24 19:58:04.093112 ip-10-0-129-124 kubenswrapper[2564]: W0424 19:58:04.093082 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddef9c1eb_27f2_4770_9867_122466fc256f.slice/crio-3ad94f4a740dcd2fc5b49fb7b8998f5f03df8280441afb6f34c9295c24b62f14 WatchSource:0}: Error finding container 3ad94f4a740dcd2fc5b49fb7b8998f5f03df8280441afb6f34c9295c24b62f14: Status 404 returned error can't find the container with id 3ad94f4a740dcd2fc5b49fb7b8998f5f03df8280441afb6f34c9295c24b62f14 Apr 24 19:58:04.133908 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:04.133878 2564 generic.go:358] "Generic (PLEG): container finished" podID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerID="f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155" exitCode=2 Apr 24 19:58:04.134077 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:04.133948 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerDied","Data":"f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155"} Apr 24 19:58:04.135155 ip-10-0-129-124 kubenswrapper[2564]: I0424 
19:58:04.135130 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerStarted","Data":"3ad94f4a740dcd2fc5b49fb7b8998f5f03df8280441afb6f34c9295c24b62f14"} Apr 24 19:58:05.031803 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:05.031756 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 24 19:58:05.139210 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:05.139171 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerStarted","Data":"8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa"} Apr 24 19:58:08.149055 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:08.149019 2564 generic.go:358] "Generic (PLEG): container finished" podID="def9c1eb-27f2-4770-9867-122466fc256f" containerID="8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa" exitCode=0 Apr 24 19:58:08.149531 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:08.149094 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerDied","Data":"8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa"} Apr 24 19:58:10.031739 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:10.031689 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" probeResult="failure" 
output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 24 19:58:15.031919 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:15.031874 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 24 19:58:15.032537 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:15.032025 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:58:20.032020 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:20.031971 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 24 19:58:25.031852 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:25.031799 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 24 19:58:30.032051 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:30.031998 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: 
connect: connection refused" Apr 24 19:58:34.233506 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.233453 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:58:34.257476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.257310 2564 generic.go:358] "Generic (PLEG): container finished" podID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerID="0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857" exitCode=137 Apr 24 19:58:34.257476 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.257431 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" Apr 24 19:58:34.257848 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.257816 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerDied","Data":"0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857"} Apr 24 19:58:34.257990 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.257863 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv" event={"ID":"8decbe48-ebbb-409c-88c3-7ad460858abe","Type":"ContainerDied","Data":"4d85482507cdb25f6d009c9c05c4eca806e0bbce48b2dab5547c1caeef87bdce"} Apr 24 19:58:34.257990 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.257886 2564 scope.go:117] "RemoveContainer" containerID="f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155" Apr 24 19:58:34.268539 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.268502 2564 scope.go:117] "RemoveContainer" containerID="0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857" Apr 24 19:58:34.278308 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.278283 2564 
scope.go:117] "RemoveContainer" containerID="d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22" Apr 24 19:58:34.287858 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.287833 2564 scope.go:117] "RemoveContainer" containerID="f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155" Apr 24 19:58:34.288257 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:58:34.288228 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155\": container with ID starting with f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155 not found: ID does not exist" containerID="f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155" Apr 24 19:58:34.288342 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.288271 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155"} err="failed to get container status \"f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155\": rpc error: code = NotFound desc = could not find container \"f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155\": container with ID starting with f68f1f0d08103f634c44618fd76dfd5e6feafdfe204e8c3bd9bfc888bec0c155 not found: ID does not exist" Apr 24 19:58:34.288342 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.288299 2564 scope.go:117] "RemoveContainer" containerID="0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857" Apr 24 19:58:34.288755 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:58:34.288715 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857\": container with ID starting with 0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857 not found: ID does 
not exist" containerID="0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857" Apr 24 19:58:34.288868 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.288752 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857"} err="failed to get container status \"0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857\": rpc error: code = NotFound desc = could not find container \"0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857\": container with ID starting with 0164a2ef78e3bc1768b167a9f37ff57b0618eb61c2f11bd15de89b0f00079857 not found: ID does not exist" Apr 24 19:58:34.288868 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.288776 2564 scope.go:117] "RemoveContainer" containerID="d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22" Apr 24 19:58:34.289101 ip-10-0-129-124 kubenswrapper[2564]: E0424 19:58:34.289072 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22\": container with ID starting with d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22 not found: ID does not exist" containerID="d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22" Apr 24 19:58:34.289182 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.289113 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22"} err="failed to get container status \"d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22\": rpc error: code = NotFound desc = could not find container \"d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22\": container with ID starting with d21b2434292af55d635beb689a150db934612660d19da3e0df6bf75f6139de22 not found: ID does not exist" Apr 
24 19:58:34.292426 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.292404 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8decbe48-ebbb-409c-88c3-7ad460858abe-kserve-provision-location\") pod \"8decbe48-ebbb-409c-88c3-7ad460858abe\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " Apr 24 19:58:34.292531 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.292451 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8decbe48-ebbb-409c-88c3-7ad460858abe-proxy-tls\") pod \"8decbe48-ebbb-409c-88c3-7ad460858abe\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " Apr 24 19:58:34.292531 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.292512 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgkws\" (UniqueName: \"kubernetes.io/projected/8decbe48-ebbb-409c-88c3-7ad460858abe-kube-api-access-pgkws\") pod \"8decbe48-ebbb-409c-88c3-7ad460858abe\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " Apr 24 19:58:34.292669 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.292600 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8decbe48-ebbb-409c-88c3-7ad460858abe-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"8decbe48-ebbb-409c-88c3-7ad460858abe\" (UID: \"8decbe48-ebbb-409c-88c3-7ad460858abe\") " Apr 24 19:58:34.293582 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.293540 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8decbe48-ebbb-409c-88c3-7ad460858abe-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "8decbe48-ebbb-409c-88c3-7ad460858abe" (UID: 
"8decbe48-ebbb-409c-88c3-7ad460858abe"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 19:58:34.297810 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.297773 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8decbe48-ebbb-409c-88c3-7ad460858abe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8decbe48-ebbb-409c-88c3-7ad460858abe" (UID: "8decbe48-ebbb-409c-88c3-7ad460858abe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 19:58:34.298356 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.298327 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8decbe48-ebbb-409c-88c3-7ad460858abe-kube-api-access-pgkws" (OuterVolumeSpecName: "kube-api-access-pgkws") pod "8decbe48-ebbb-409c-88c3-7ad460858abe" (UID: "8decbe48-ebbb-409c-88c3-7ad460858abe"). InnerVolumeSpecName "kube-api-access-pgkws". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 19:58:34.302263 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.302233 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8decbe48-ebbb-409c-88c3-7ad460858abe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8decbe48-ebbb-409c-88c3-7ad460858abe" (UID: "8decbe48-ebbb-409c-88c3-7ad460858abe"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 19:58:34.394242 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.394151 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8decbe48-ebbb-409c-88c3-7ad460858abe-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:58:34.394242 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.394193 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8decbe48-ebbb-409c-88c3-7ad460858abe-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:58:34.394242 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.394207 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8decbe48-ebbb-409c-88c3-7ad460858abe-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:58:34.394542 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.394246 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pgkws\" (UniqueName: \"kubernetes.io/projected/8decbe48-ebbb-409c-88c3-7ad460858abe-kube-api-access-pgkws\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 19:58:34.585721 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.585671 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv"] Apr 24 19:58:34.588770 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:34.588737 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-ljdwv"] Apr 24 19:58:35.186132 ip-10-0-129-124 kubenswrapper[2564]: I0424 19:58:35.186085 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" path="/var/lib/kubelet/pods/8decbe48-ebbb-409c-88c3-7ad460858abe/volumes" Apr 24 20:00:02.678878 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:02.678850 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 20:00:03.536086 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:03.536049 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerStarted","Data":"e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a"} Apr 24 20:00:03.536086 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:03.536090 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerStarted","Data":"6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb"} Apr 24 20:00:03.536512 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:03.536189 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 20:00:03.559444 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:03.559388 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" podStartSLOduration=6.15901406 podStartE2EDuration="2m0.559366546s" podCreationTimestamp="2026-04-24 19:58:03 +0000 UTC" firstStartedPulling="2026-04-24 19:58:08.150278162 +0000 UTC m=+3081.474522304" lastFinishedPulling="2026-04-24 20:00:02.550630649 +0000 UTC m=+3195.874874790" observedRunningTime="2026-04-24 20:00:03.55757269 +0000 UTC m=+3196.881816864" watchObservedRunningTime="2026-04-24 20:00:03.559366546 +0000 UTC m=+3196.883610712" Apr 24 20:00:04.538717 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:04.538689 2564 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 20:00:04.539712 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:04.539686 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 24 20:00:05.541183 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:05.541138 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 24 20:00:10.545213 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:10.545184 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 20:00:10.546085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:10.546065 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 20:00:15.107959 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.107923 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn"] Apr 24 20:00:15.108472 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.108337 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kserve-container" containerID="cri-o://6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb" gracePeriod=30 Apr 24 20:00:15.108472 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.108400 2564 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kube-rbac-proxy" containerID="cri-o://e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a" gracePeriod=30 Apr 24 20:00:15.217874 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.217839 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"] Apr 24 20:00:15.218263 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218246 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="storage-initializer" Apr 24 20:00:15.218317 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218267 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="storage-initializer" Apr 24 20:00:15.218317 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218279 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kserve-container" Apr 24 20:00:15.218317 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218289 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kserve-container" Apr 24 20:00:15.218423 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218317 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" Apr 24 20:00:15.218423 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218325 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" Apr 24 20:00:15.218423 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218400 2564 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kube-rbac-proxy" Apr 24 20:00:15.218423 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.218412 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8decbe48-ebbb-409c-88c3-7ad460858abe" containerName="kserve-container" Apr 24 20:00:15.233348 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.233315 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"] Apr 24 20:00:15.233524 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.233415 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.236037 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.236011 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 24 20:00:15.236154 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.236140 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 24 20:00:15.364942 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.364846 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdtz5\" (UniqueName: \"kubernetes.io/projected/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kube-api-access-bdtz5\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.364942 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.364889 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.364942 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.364931 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.365179 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.364973 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.465381 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.465339 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.465629 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.465402 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bdtz5\" (UniqueName: \"kubernetes.io/projected/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kube-api-access-bdtz5\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: 
\"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.465629 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.465430 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.465629 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.465471 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.465833 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.465801 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.466134 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.466112 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.467941 ip-10-0-129-124 
kubenswrapper[2564]: I0424 20:00:15.467920 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.475034 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.474997 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdtz5\" (UniqueName: \"kubernetes.io/projected/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kube-api-access-bdtz5\") pod \"isvc-xgboost-predictor-8689c4cfcc-89622\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.542212 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.542167 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.54:8643/healthz\": dial tcp 10.132.0.54:8643: connect: connection refused" Apr 24 20:00:15.543309 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.543282 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" Apr 24 20:00:15.569669 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.569636 2564 generic.go:358] "Generic (PLEG): container finished" podID="def9c1eb-27f2-4770-9867-122466fc256f" containerID="e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a" exitCode=2 Apr 24 20:00:15.569669 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.569662 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerDied","Data":"e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a"} Apr 24 20:00:15.661942 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:15.661908 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"] Apr 24 20:00:15.664941 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:00:15.664917 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d5f3449_0cd3_4f31_a32f_d9d6573c3662.slice/crio-441b5e319a3be9522a2c2286ac7498d695209970a44561633a33e4101b56a6fe WatchSource:0}: Error finding container 441b5e319a3be9522a2c2286ac7498d695209970a44561633a33e4101b56a6fe: Status 404 returned error can't find the container with id 441b5e319a3be9522a2c2286ac7498d695209970a44561633a33e4101b56a6fe Apr 24 20:00:16.574022 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:16.573989 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerStarted","Data":"7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496"} Apr 24 20:00:16.574022 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:16.574024 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerStarted","Data":"441b5e319a3be9522a2c2286ac7498d695209970a44561633a33e4101b56a6fe"} Apr 24 20:00:17.360396 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.360368 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 20:00:17.481942 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.481911 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/def9c1eb-27f2-4770-9867-122466fc256f-proxy-tls\") pod \"def9c1eb-27f2-4770-9867-122466fc256f\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " Apr 24 20:00:17.482134 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.481973 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/def9c1eb-27f2-4770-9867-122466fc256f-isvc-triton-kube-rbac-proxy-sar-config\") pod \"def9c1eb-27f2-4770-9867-122466fc256f\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " Apr 24 20:00:17.482134 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.482017 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/def9c1eb-27f2-4770-9867-122466fc256f-kserve-provision-location\") pod \"def9c1eb-27f2-4770-9867-122466fc256f\" (UID: \"def9c1eb-27f2-4770-9867-122466fc256f\") " Apr 24 20:00:17.482134 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.482057 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggdrh\" (UniqueName: \"kubernetes.io/projected/def9c1eb-27f2-4770-9867-122466fc256f-kube-api-access-ggdrh\") pod \"def9c1eb-27f2-4770-9867-122466fc256f\" (UID: 
\"def9c1eb-27f2-4770-9867-122466fc256f\") " Apr 24 20:00:17.482354 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.482328 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/def9c1eb-27f2-4770-9867-122466fc256f-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod "def9c1eb-27f2-4770-9867-122466fc256f" (UID: "def9c1eb-27f2-4770-9867-122466fc256f"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:00:17.482434 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.482416 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def9c1eb-27f2-4770-9867-122466fc256f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "def9c1eb-27f2-4770-9867-122466fc256f" (UID: "def9c1eb-27f2-4770-9867-122466fc256f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:00:17.484157 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.484132 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def9c1eb-27f2-4770-9867-122466fc256f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "def9c1eb-27f2-4770-9867-122466fc256f" (UID: "def9c1eb-27f2-4770-9867-122466fc256f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:00:17.484261 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.484218 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def9c1eb-27f2-4770-9867-122466fc256f-kube-api-access-ggdrh" (OuterVolumeSpecName: "kube-api-access-ggdrh") pod "def9c1eb-27f2-4770-9867-122466fc256f" (UID: "def9c1eb-27f2-4770-9867-122466fc256f"). InnerVolumeSpecName "kube-api-access-ggdrh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:00:17.578404 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.578369 2564 generic.go:358] "Generic (PLEG): container finished" podID="def9c1eb-27f2-4770-9867-122466fc256f" containerID="6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb" exitCode=0 Apr 24 20:00:17.578858 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.578446 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" Apr 24 20:00:17.578858 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.578460 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerDied","Data":"6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb"} Apr 24 20:00:17.578858 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.578502 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn" event={"ID":"def9c1eb-27f2-4770-9867-122466fc256f","Type":"ContainerDied","Data":"3ad94f4a740dcd2fc5b49fb7b8998f5f03df8280441afb6f34c9295c24b62f14"} Apr 24 20:00:17.578858 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.578521 2564 scope.go:117] "RemoveContainer" containerID="e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a" Apr 24 20:00:17.582683 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.582660 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/def9c1eb-27f2-4770-9867-122466fc256f-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:00:17.582796 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.582688 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/def9c1eb-27f2-4770-9867-122466fc256f-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:00:17.582796 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.582706 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ggdrh\" (UniqueName: \"kubernetes.io/projected/def9c1eb-27f2-4770-9867-122466fc256f-kube-api-access-ggdrh\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:00:17.582796 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.582723 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/def9c1eb-27f2-4770-9867-122466fc256f-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:00:17.589375 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.589255 2564 scope.go:117] "RemoveContainer" containerID="6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb" Apr 24 20:00:17.596669 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.596650 2564 scope.go:117] "RemoveContainer" containerID="8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa" Apr 24 20:00:17.603766 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.603746 2564 scope.go:117] "RemoveContainer" containerID="e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a" Apr 24 20:00:17.604020 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:00:17.603999 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a\": container with ID starting with e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a not found: ID does not exist" containerID="e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a" Apr 24 20:00:17.604121 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.604026 2564 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a"} err="failed to get container status \"e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a\": rpc error: code = NotFound desc = could not find container \"e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a\": container with ID starting with e5527d8fbbe20e103ed390dc27af02c0650aa6b42c37f7ab962574945563fe2a not found: ID does not exist"
Apr 24 20:00:17.604121 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.604044 2564 scope.go:117] "RemoveContainer" containerID="6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb"
Apr 24 20:00:17.604329 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:00:17.604315 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb\": container with ID starting with 6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb not found: ID does not exist" containerID="6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb"
Apr 24 20:00:17.604383 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.604332 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb"} err="failed to get container status \"6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb\": rpc error: code = NotFound desc = could not find container \"6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb\": container with ID starting with 6596147ad4818543e68057e07eb319fe5bab6a5af96e5a4b8dcf7a21ed29a6cb not found: ID does not exist"
Apr 24 20:00:17.604383 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.604344 2564 scope.go:117] "RemoveContainer" containerID="8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa"
Apr 24 20:00:17.604383 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.604366 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn"]
Apr 24 20:00:17.604577 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:00:17.604534 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa\": container with ID starting with 8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa not found: ID does not exist" containerID="8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa"
Apr 24 20:00:17.604577 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.604569 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa"} err="failed to get container status \"8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa\": rpc error: code = NotFound desc = could not find container \"8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa\": container with ID starting with 8be79d4d74e57b985a6e7da0ddc8eea342097416d79fe314a31a20541548fcfa not found: ID does not exist"
Apr 24 20:00:17.607947 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:17.607921 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-nbfjn"]
Apr 24 20:00:19.184442 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:19.184399 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def9c1eb-27f2-4770-9867-122466fc256f" path="/var/lib/kubelet/pods/def9c1eb-27f2-4770-9867-122466fc256f/volumes"
Apr 24 20:00:20.588443 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:20.588408 2564 generic.go:358] "Generic (PLEG): container finished" podID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerID="7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496" exitCode=0
Apr 24 20:00:20.588942 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:20.588490 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerDied","Data":"7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496"}
Apr 24 20:00:40.651106 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:40.651013 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerStarted","Data":"6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6"}
Apr 24 20:00:40.651106 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:40.651053 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerStarted","Data":"0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955"}
Apr 24 20:00:40.651708 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:40.651286 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"
Apr 24 20:00:40.669215 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:40.669167 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podStartSLOduration=5.923256612 podStartE2EDuration="25.669153745s" podCreationTimestamp="2026-04-24 20:00:15 +0000 UTC" firstStartedPulling="2026-04-24 20:00:20.589690005 +0000 UTC m=+3213.913934146" lastFinishedPulling="2026-04-24 20:00:40.335587137 +0000 UTC m=+3233.659831279" observedRunningTime="2026-04-24 20:00:40.668257695 +0000 UTC m=+3233.992501880" watchObservedRunningTime="2026-04-24 20:00:40.669153745 +0000 UTC m=+3233.993397908"
Apr 24 20:00:41.654283 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:41.654249 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"
Apr 24 20:00:41.655501 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:41.655473 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:00:42.656684 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:42.656648 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:00:47.660976 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:47.660950 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"
Apr 24 20:00:47.661511 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:47.661485 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:00:57.661932 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:00:57.661891 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:01:07.662325 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:07.662276 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:01:17.661580 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:17.661524 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:01:27.661871 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:27.661827 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:01:37.661601 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:37.661535 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:01:47.662404 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:47.662372 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"
Apr 24 20:01:55.367211 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.367176 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"]
Apr 24 20:01:55.367668 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.367476 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" containerID="cri-o://0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955" gracePeriod=30
Apr 24 20:01:55.367668 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.367543 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kube-rbac-proxy" containerID="cri-o://6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6" gracePeriod=30
Apr 24 20:01:55.466158 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466121 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"]
Apr 24 20:01:55.466405 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466393 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="storage-initializer"
Apr 24 20:01:55.466450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466407 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="storage-initializer"
Apr 24 20:01:55.466450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466422 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kube-rbac-proxy"
Apr 24 20:01:55.466450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466429 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kube-rbac-proxy"
Apr 24 20:01:55.466450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466437 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kserve-container"
Apr 24 20:01:55.466450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466443 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kserve-container"
Apr 24 20:01:55.466645 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466493 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kserve-container"
Apr 24 20:01:55.466645 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.466501 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="def9c1eb-27f2-4770-9867-122466fc256f" containerName="kube-rbac-proxy"
Apr 24 20:01:55.469592 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.469575 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.473799 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.473778 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\""
Apr 24 20:01:55.474215 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.474194 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\""
Apr 24 20:01:55.481196 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.481172 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"]
Apr 24 20:01:55.547678 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.547646 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d714a28-3874-4872-8b35-1ec72d60f44f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.547865 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.547702 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5vdv\" (UniqueName: \"kubernetes.io/projected/5d714a28-3874-4872-8b35-1ec72d60f44f-kube-api-access-b5vdv\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.547865 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.547728 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d714a28-3874-4872-8b35-1ec72d60f44f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.547865 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.547750 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.648198 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.648109 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d714a28-3874-4872-8b35-1ec72d60f44f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.648198 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.648156 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b5vdv\" (UniqueName: \"kubernetes.io/projected/5d714a28-3874-4872-8b35-1ec72d60f44f-kube-api-access-b5vdv\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.648198 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.648191 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d714a28-3874-4872-8b35-1ec72d60f44f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.648420 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.648211 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.648495 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.648475 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d714a28-3874-4872-8b35-1ec72d60f44f-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.648537 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:01:55.648510 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-serving-cert: secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found
Apr 24 20:01:55.648634 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:01:55.648621 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls podName:5d714a28-3874-4872-8b35-1ec72d60f44f nodeName:}" failed. No retries permitted until 2026-04-24 20:01:56.148599328 +0000 UTC m=+3309.472843476 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls") pod "isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" (UID: "5d714a28-3874-4872-8b35-1ec72d60f44f") : secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found
Apr 24 20:01:55.648818 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.648801 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d714a28-3874-4872-8b35-1ec72d60f44f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.656867 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.656835 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5vdv\" (UniqueName: \"kubernetes.io/projected/5d714a28-3874-4872-8b35-1ec72d60f44f-kube-api-access-b5vdv\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:55.869395 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.869360 2564 generic.go:358] "Generic (PLEG): container finished" podID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerID="6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6" exitCode=2
Apr 24 20:01:55.869596 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:55.869435 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerDied","Data":"6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6"}
Apr 24 20:01:56.151768 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:56.151733 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:56.154230 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:56.154210 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:56.380194 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:56.380152 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:01:56.508593 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:56.508510 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"]
Apr 24 20:01:56.511135 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:01:56.511100 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d714a28_3874_4872_8b35_1ec72d60f44f.slice/crio-fd04e99d0c54af4f6c330c6f538eb291460d4aed9a118057e8207f4095f26b42 WatchSource:0}: Error finding container fd04e99d0c54af4f6c330c6f538eb291460d4aed9a118057e8207f4095f26b42: Status 404 returned error can't find the container with id fd04e99d0c54af4f6c330c6f538eb291460d4aed9a118057e8207f4095f26b42
Apr 24 20:01:56.874441 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:56.874402 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerStarted","Data":"f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8"}
Apr 24 20:01:56.874441 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:56.874447 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerStarted","Data":"fd04e99d0c54af4f6c330c6f538eb291460d4aed9a118057e8207f4095f26b42"}
Apr 24 20:01:57.657645 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:57.657601 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.55:8643/healthz\": dial tcp 10.132.0.55:8643: connect: connection refused"
Apr 24 20:01:57.662111 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:57.662080 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 24 20:01:59.118715 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.118690 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"
Apr 24 20:01:59.276911 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.276882 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-proxy-tls\") pod \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") "
Apr 24 20:01:59.277120 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.276964 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") "
Apr 24 20:01:59.277120 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.276998 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kserve-provision-location\") pod \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") "
Apr 24 20:01:59.277120 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.277041 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdtz5\" (UniqueName: \"kubernetes.io/projected/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kube-api-access-bdtz5\") pod \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\" (UID: \"8d5f3449-0cd3-4f31-a32f-d9d6573c3662\") "
Apr 24 20:01:59.277324 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.277295 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "8d5f3449-0cd3-4f31-a32f-d9d6573c3662" (UID: "8d5f3449-0cd3-4f31-a32f-d9d6573c3662"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 20:01:59.277389 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.277350 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d5f3449-0cd3-4f31-a32f-d9d6573c3662" (UID: "8d5f3449-0cd3-4f31-a32f-d9d6573c3662"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 20:01:59.279182 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.279151 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kube-api-access-bdtz5" (OuterVolumeSpecName: "kube-api-access-bdtz5") pod "8d5f3449-0cd3-4f31-a32f-d9d6573c3662" (UID: "8d5f3449-0cd3-4f31-a32f-d9d6573c3662"). InnerVolumeSpecName "kube-api-access-bdtz5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 20:01:59.279182 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.279156 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "8d5f3449-0cd3-4f31-a32f-d9d6573c3662" (UID: "8d5f3449-0cd3-4f31-a32f-d9d6573c3662"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 20:01:59.377816 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.377771 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:01:59.377816 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.377808 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:01:59.377816 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.377818 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bdtz5\" (UniqueName: \"kubernetes.io/projected/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-kube-api-access-bdtz5\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:01:59.377816 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.377828 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/8d5f3449-0cd3-4f31-a32f-d9d6573c3662-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:01:59.884072 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.884038 2564 generic.go:358] "Generic (PLEG): container finished" podID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerID="0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955" exitCode=0
Apr 24 20:01:59.884251 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.884125 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"
Apr 24 20:01:59.884251 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.884129 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerDied","Data":"0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955"}
Apr 24 20:01:59.884251 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.884164 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622" event={"ID":"8d5f3449-0cd3-4f31-a32f-d9d6573c3662","Type":"ContainerDied","Data":"441b5e319a3be9522a2c2286ac7498d695209970a44561633a33e4101b56a6fe"}
Apr 24 20:01:59.884251 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.884181 2564 scope.go:117] "RemoveContainer" containerID="6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6"
Apr 24 20:01:59.892177 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.891963 2564 scope.go:117] "RemoveContainer" containerID="0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955"
Apr 24 20:01:59.899298 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.899281 2564 scope.go:117] "RemoveContainer" containerID="7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496"
Apr 24 20:01:59.905971 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.905945 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"]
Apr 24 20:01:59.907085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.907065 2564 scope.go:117] "RemoveContainer" containerID="6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6"
Apr 24 20:01:59.907386 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:01:59.907366 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6\": container with ID starting with 6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6 not found: ID does not exist" containerID="6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6"
Apr 24 20:01:59.907446 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.907395 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6"} err="failed to get container status \"6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6\": rpc error: code = NotFound desc = could not find container \"6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6\": container with ID starting with 6a3754c8bcbe707ba0acc534cca4ce09f3aeaccb791763a1cce8415079c0a2c6 not found: ID does not exist"
Apr 24 20:01:59.907446 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.907417 2564 scope.go:117] "RemoveContainer" containerID="0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955"
Apr 24 20:01:59.907734 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:01:59.907703 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955\": container with ID starting with 0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955 not found: ID does not exist" containerID="0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955"
Apr 24 20:01:59.907817 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.907730 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955"} err="failed to get container status \"0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955\": rpc error: code = NotFound desc = could not find container \"0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955\": container with ID starting with 0d0a6ecb40b923cb1f7b4bff2cbbb1a7ef0f687bf13c1749b56d8076b560e955 not found: ID does not exist"
Apr 24 20:01:59.907817 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.907749 2564 scope.go:117] "RemoveContainer" containerID="7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496"
Apr 24 20:01:59.907958 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:01:59.907940 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496\": container with ID starting with 7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496 not found: ID does not exist" containerID="7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496"
Apr 24 20:01:59.908001 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.907963 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496"} err="failed to get container status \"7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496\": rpc error: code = NotFound desc = could not find container \"7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496\": container with ID starting with 7fd66b5bf81f8911662cb241a520b89488d893014507b16047ee47a959562496 not found: ID does not exist"
Apr 24 20:01:59.911525 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:01:59.911506 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-89622"]
Apr 24 20:02:00.891404 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:00.891370 2564 generic.go:358] "Generic (PLEG): container finished" podID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerID="f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8" exitCode=0
Apr 24 20:02:00.891784 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:00.891443 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerDied","Data":"f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8"}
Apr 24 20:02:01.184447 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:01.184367 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" path="/var/lib/kubelet/pods/8d5f3449-0cd3-4f31-a32f-d9d6573c3662/volumes"
Apr 24 20:02:01.896366 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:01.896332 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerStarted","Data":"52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f"}
Apr 24 20:02:01.896366 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:01.896372 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerStarted","Data":"d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d"}
Apr 24 20:02:01.896826 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:01.896588 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:02:01.896826 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:01.896610 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:02:01.916122 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:01.916073 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" podStartSLOduration=6.916058264 podStartE2EDuration="6.916058264s" podCreationTimestamp="2026-04-24 20:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:02:01.914876259 +0000 UTC m=+3315.239120432" watchObservedRunningTime="2026-04-24 20:02:01.916058264 +0000 UTC m=+3315.240302427"
Apr 24 20:02:07.906279 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:07.906197 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:02:37.919125 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:37.919087 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"
Apr 24 20:02:45.586676 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.586631 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"]
Apr 24 20:02:45.587193 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.586975 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kserve-container" containerID="cri-o://d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d" gracePeriod=30
Apr 24 20:02:45.587193 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.587040 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kube-rbac-proxy" containerID="cri-o://52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f" gracePeriod=30
Apr 24 20:02:45.602785 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.602758 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7"]
Apr 24 20:02:45.603090 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603077 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="storage-initializer"
Apr 24 20:02:45.603133 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603092 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="storage-initializer"
Apr 24 20:02:45.603133 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603117 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container"
Apr 24 20:02:45.603133 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603122 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container"
Apr 24 20:02:45.603133 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603133 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kube-rbac-proxy"
Apr 24 20:02:45.603255 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603139 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kube-rbac-proxy"
Apr 24 20:02:45.603255 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603188 2564 memory_manager.go:356] "RemoveStaleState removing state"
podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kserve-container" Apr 24 20:02:45.603255 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.603196 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d5f3449-0cd3-4f31-a32f-d9d6573c3662" containerName="kube-rbac-proxy" Apr 24 20:02:45.606379 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.606364 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.609545 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.609518 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 24 20:02:45.609663 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.609522 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 24 20:02:45.619546 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.619523 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7"] Apr 24 20:02:45.624266 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.624243 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/644fb20b-5134-49f6-8316-bbe031aa1c40-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.624403 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.624276 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/644fb20b-5134-49f6-8316-bbe031aa1c40-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.624403 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.624295 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lchz\" (UniqueName: \"kubernetes.io/projected/644fb20b-5134-49f6-8316-bbe031aa1c40-kube-api-access-4lchz\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.624521 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.624428 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/644fb20b-5134-49f6-8316-bbe031aa1c40-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.725345 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.725303 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/644fb20b-5134-49f6-8316-bbe031aa1c40-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.725525 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.725362 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/644fb20b-5134-49f6-8316-bbe031aa1c40-kserve-provision-location\") pod 
\"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.725525 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.725389 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/644fb20b-5134-49f6-8316-bbe031aa1c40-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.725525 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.725413 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lchz\" (UniqueName: \"kubernetes.io/projected/644fb20b-5134-49f6-8316-bbe031aa1c40-kube-api-access-4lchz\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.725829 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.725794 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/644fb20b-5134-49f6-8316-bbe031aa1c40-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.726075 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.726052 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/644fb20b-5134-49f6-8316-bbe031aa1c40-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod 
\"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.727908 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.727889 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/644fb20b-5134-49f6-8316-bbe031aa1c40-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.735946 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.735921 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lchz\" (UniqueName: \"kubernetes.io/projected/644fb20b-5134-49f6-8316-bbe031aa1c40-kube-api-access-4lchz\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-cv6p7\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:45.916306 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:45.916227 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:46.020860 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:46.020823 2564 generic.go:358] "Generic (PLEG): container finished" podID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerID="52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f" exitCode=2 Apr 24 20:02:46.021016 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:46.020892 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerDied","Data":"52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f"} Apr 24 20:02:46.041966 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:46.041939 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7"] Apr 24 20:02:46.044266 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:02:46.044233 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod644fb20b_5134_49f6_8316_bbe031aa1c40.slice/crio-c0033f9b301865e1fcff8493b048b82749ca66dfa664570e89665ca9db939c76 WatchSource:0}: Error finding container c0033f9b301865e1fcff8493b048b82749ca66dfa664570e89665ca9db939c76: Status 404 returned error can't find the container with id c0033f9b301865e1fcff8493b048b82749ca66dfa664570e89665ca9db939c76 Apr 24 20:02:47.024806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:47.024759 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerStarted","Data":"7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a"} Apr 24 20:02:47.024806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:47.024811 2564 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerStarted","Data":"c0033f9b301865e1fcff8493b048b82749ca66dfa664570e89665ca9db939c76"} Apr 24 20:02:47.900089 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:47.899984 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.56:8643/healthz\": dial tcp 10.132.0.56:8643: connect: connection refused" Apr 24 20:02:50.034355 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:50.034320 2564 generic.go:358] "Generic (PLEG): container finished" podID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerID="7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a" exitCode=0 Apr 24 20:02:50.034764 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:50.034400 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerDied","Data":"7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a"} Apr 24 20:02:51.039133 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:51.039099 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerStarted","Data":"87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952"} Apr 24 20:02:51.039133 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:51.039138 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" 
event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerStarted","Data":"d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa"} Apr 24 20:02:51.039644 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:51.039354 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:51.060670 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:51.060614 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" podStartSLOduration=6.06059681 podStartE2EDuration="6.06059681s" podCreationTimestamp="2026-04-24 20:02:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:02:51.060159465 +0000 UTC m=+3364.384403668" watchObservedRunningTime="2026-04-24 20:02:51.06059681 +0000 UTC m=+3364.384840973" Apr 24 20:02:52.043150 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.043120 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:02:52.121793 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:02:52.121744 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d714a28_3874_4872_8b35_1ec72d60f44f.slice/crio-conmon-d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d714a28_3874_4872_8b35_1ec72d60f44f.slice/crio-d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d.scope\": RecentStats: unable to find data in memory cache]" Apr 24 20:02:52.231751 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.231726 2564 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" Apr 24 20:02:52.283042 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.282948 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d714a28-3874-4872-8b35-1ec72d60f44f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"5d714a28-3874-4872-8b35-1ec72d60f44f\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " Apr 24 20:02:52.283042 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.283014 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls\") pod \"5d714a28-3874-4872-8b35-1ec72d60f44f\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " Apr 24 20:02:52.283238 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.283075 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5vdv\" (UniqueName: \"kubernetes.io/projected/5d714a28-3874-4872-8b35-1ec72d60f44f-kube-api-access-b5vdv\") pod \"5d714a28-3874-4872-8b35-1ec72d60f44f\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " Apr 24 20:02:52.283238 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.283098 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d714a28-3874-4872-8b35-1ec72d60f44f-kserve-provision-location\") pod \"5d714a28-3874-4872-8b35-1ec72d60f44f\" (UID: \"5d714a28-3874-4872-8b35-1ec72d60f44f\") " Apr 24 20:02:52.283433 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.283338 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5d714a28-3874-4872-8b35-1ec72d60f44f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "5d714a28-3874-4872-8b35-1ec72d60f44f" (UID: "5d714a28-3874-4872-8b35-1ec72d60f44f"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:02:52.283492 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.283472 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d714a28-3874-4872-8b35-1ec72d60f44f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5d714a28-3874-4872-8b35-1ec72d60f44f" (UID: "5d714a28-3874-4872-8b35-1ec72d60f44f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:02:52.285288 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.285255 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5d714a28-3874-4872-8b35-1ec72d60f44f" (UID: "5d714a28-3874-4872-8b35-1ec72d60f44f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:02:52.285288 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.285257 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d714a28-3874-4872-8b35-1ec72d60f44f-kube-api-access-b5vdv" (OuterVolumeSpecName: "kube-api-access-b5vdv") pod "5d714a28-3874-4872-8b35-1ec72d60f44f" (UID: "5d714a28-3874-4872-8b35-1ec72d60f44f"). InnerVolumeSpecName "kube-api-access-b5vdv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:02:52.384008 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.383974 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b5vdv\" (UniqueName: \"kubernetes.io/projected/5d714a28-3874-4872-8b35-1ec72d60f44f-kube-api-access-b5vdv\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:02:52.384008 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.384005 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5d714a28-3874-4872-8b35-1ec72d60f44f-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:02:52.384008 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.384016 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5d714a28-3874-4872-8b35-1ec72d60f44f-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:02:52.384267 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:52.384026 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5d714a28-3874-4872-8b35-1ec72d60f44f-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:02:53.049536 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.049503 2564 generic.go:358] "Generic (PLEG): container finished" podID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerID="d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d" exitCode=0 Apr 24 20:02:53.049984 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.049590 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" 
event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerDied","Data":"d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d"} Apr 24 20:02:53.049984 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.049618 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" Apr 24 20:02:53.049984 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.049641 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf" event={"ID":"5d714a28-3874-4872-8b35-1ec72d60f44f","Type":"ContainerDied","Data":"fd04e99d0c54af4f6c330c6f538eb291460d4aed9a118057e8207f4095f26b42"} Apr 24 20:02:53.049984 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.049666 2564 scope.go:117] "RemoveContainer" containerID="52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f" Apr 24 20:02:53.057413 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.057394 2564 scope.go:117] "RemoveContainer" containerID="d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d" Apr 24 20:02:53.064628 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.064611 2564 scope.go:117] "RemoveContainer" containerID="f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8" Apr 24 20:02:53.071257 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.071232 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"] Apr 24 20:02:53.071718 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.071689 2564 scope.go:117] "RemoveContainer" containerID="52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f" Apr 24 20:02:53.071978 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:02:53.071960 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f\": container with ID starting with 52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f not found: ID does not exist" containerID="52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f" Apr 24 20:02:53.072034 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.071986 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f"} err="failed to get container status \"52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f\": rpc error: code = NotFound desc = could not find container \"52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f\": container with ID starting with 52bf2394552902947d37fa11c9464a072e6f1ddd44f2ab2d8b40cb2a8dbc502f not found: ID does not exist" Apr 24 20:02:53.072034 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.072002 2564 scope.go:117] "RemoveContainer" containerID="d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d" Apr 24 20:02:53.072233 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:02:53.072215 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d\": container with ID starting with d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d not found: ID does not exist" containerID="d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d" Apr 24 20:02:53.072306 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.072243 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d"} err="failed to get container status \"d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d\": rpc error: code = NotFound desc = could not find container 
\"d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d\": container with ID starting with d79dbad744f1ef4c8b099776d15149310be5daba9cc4464ba6b8c2c94940604d not found: ID does not exist" Apr 24 20:02:53.072306 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.072266 2564 scope.go:117] "RemoveContainer" containerID="f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8" Apr 24 20:02:53.072515 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:02:53.072497 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8\": container with ID starting with f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8 not found: ID does not exist" containerID="f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8" Apr 24 20:02:53.072582 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.072519 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8"} err="failed to get container status \"f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8\": rpc error: code = NotFound desc = could not find container \"f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8\": container with ID starting with f01b07b518acbbc7e1a3af4a8fdd9c4afd7832348b138bafb45a9f84ba6676a8 not found: ID does not exist" Apr 24 20:02:53.075379 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.075357 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-4dwbf"] Apr 24 20:02:53.184044 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:53.184011 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" path="/var/lib/kubelet/pods/5d714a28-3874-4872-8b35-1ec72d60f44f/volumes" Apr 24 20:02:58.055518 
ip-10-0-129-124 kubenswrapper[2564]: I0424 20:02:58.055488 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:03:28.059158 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:28.059117 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:03:35.674336 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.674300 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7"] Apr 24 20:03:35.674927 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.674625 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kserve-container" containerID="cri-o://d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa" gracePeriod=30 Apr 24 20:03:35.674927 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.674672 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kube-rbac-proxy" containerID="cri-o://87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952" gracePeriod=30 Apr 24 20:03:35.765932 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.765894 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"] Apr 24 20:03:35.766177 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766165 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kserve-container" Apr 24 20:03:35.766234 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766179 2564 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kserve-container" Apr 24 20:03:35.766234 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766191 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="storage-initializer" Apr 24 20:03:35.766234 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766197 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="storage-initializer" Apr 24 20:03:35.766234 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766207 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kube-rbac-proxy" Apr 24 20:03:35.766234 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766213 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kube-rbac-proxy" Apr 24 20:03:35.766391 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766267 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kserve-container" Apr 24 20:03:35.766391 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.766280 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d714a28-3874-4872-8b35-1ec72d60f44f" containerName="kube-rbac-proxy" Apr 24 20:03:35.769456 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.769439 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:35.771941 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.771921 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 24 20:03:35.772037 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.771925 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 24 20:03:35.780959 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.780926 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"] Apr 24 20:03:35.930678 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.930576 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dv2qs\" (UniqueName: \"kubernetes.io/projected/58fa2331-6d65-4820-b748-c4151cebf06b-kube-api-access-dv2qs\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:35.930835 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.930679 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58fa2331-6d65-4820-b748-c4151cebf06b-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:35.930835 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.930745 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/58fa2331-6d65-4820-b748-c4151cebf06b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:35.930835 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:35.930821 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58fa2331-6d65-4820-b748-c4151cebf06b-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.032153 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.032066 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58fa2331-6d65-4820-b748-c4151cebf06b-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.032153 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.032111 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58fa2331-6d65-4820-b748-c4151cebf06b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.032153 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.032144 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/58fa2331-6d65-4820-b748-c4151cebf06b-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.032375 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.032174 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dv2qs\" (UniqueName: \"kubernetes.io/projected/58fa2331-6d65-4820-b748-c4151cebf06b-kube-api-access-dv2qs\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.032502 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.032482 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58fa2331-6d65-4820-b748-c4151cebf06b-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.032895 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.032876 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58fa2331-6d65-4820-b748-c4151cebf06b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.034834 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.034810 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58fa2331-6d65-4820-b748-c4151cebf06b-proxy-tls\") pod 
\"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.040433 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.040407 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dv2qs\" (UniqueName: \"kubernetes.io/projected/58fa2331-6d65-4820-b748-c4151cebf06b-kube-api-access-dv2qs\") pod \"isvc-xgboost-runtime-predictor-779db84d9-lq56x\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.079531 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.079492 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:36.179784 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.179753 2564 generic.go:358] "Generic (PLEG): container finished" podID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerID="87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952" exitCode=2 Apr 24 20:03:36.179977 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.179825 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerDied","Data":"87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952"} Apr 24 20:03:36.202077 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:36.201990 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"] Apr 24 20:03:36.204687 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:03:36.204662 2564 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58fa2331_6d65_4820_b748_c4151cebf06b.slice/crio-4ad6b770fd23999f8e748a473e0ea1e63506eb0625b45eef1778788509e11210 WatchSource:0}: Error finding container 4ad6b770fd23999f8e748a473e0ea1e63506eb0625b45eef1778788509e11210: Status 404 returned error can't find the container with id 4ad6b770fd23999f8e748a473e0ea1e63506eb0625b45eef1778788509e11210 Apr 24 20:03:37.187222 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:37.187185 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerStarted","Data":"607dddfb426fa8cb1a4a338cd719c3d347718e36a4a21ac45caba4217a80d38b"} Apr 24 20:03:37.187222 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:37.187224 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerStarted","Data":"4ad6b770fd23999f8e748a473e0ea1e63506eb0625b45eef1778788509e11210"} Apr 24 20:03:38.050740 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:38.050691 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.57:8643/healthz\": dial tcp 10.132.0.57:8643: connect: connection refused" Apr 24 20:03:40.196836 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:40.196790 2564 generic.go:358] "Generic (PLEG): container finished" podID="58fa2331-6d65-4820-b748-c4151cebf06b" containerID="607dddfb426fa8cb1a4a338cd719c3d347718e36a4a21ac45caba4217a80d38b" exitCode=0 Apr 24 20:03:40.197322 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:40.196860 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerDied","Data":"607dddfb426fa8cb1a4a338cd719c3d347718e36a4a21ac45caba4217a80d38b"} Apr 24 20:03:41.201269 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:41.201233 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerStarted","Data":"1e210768aa26e999ddc6c20b43f42edfb20ac1eabcd3e8c9266b45790fc12e2d"} Apr 24 20:03:41.201269 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:41.201276 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerStarted","Data":"e27e9406e9867f759c4cf067cde35ac1584dd96ea4cf01c4ec435e2a784e3f4c"} Apr 24 20:03:41.201980 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:41.201507 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:41.221517 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:41.221452 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podStartSLOduration=6.221432692 podStartE2EDuration="6.221432692s" podCreationTimestamp="2026-04-24 20:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:03:41.220698235 +0000 UTC m=+3414.544942399" watchObservedRunningTime="2026-04-24 20:03:41.221432692 +0000 UTC m=+3414.545676853" Apr 24 20:03:42.203931 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.203901 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:42.205228 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.205201 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:03:42.503487 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.503464 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:03:42.687334 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.687293 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/644fb20b-5134-49f6-8316-bbe031aa1c40-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"644fb20b-5134-49f6-8316-bbe031aa1c40\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " Apr 24 20:03:42.687334 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.687333 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lchz\" (UniqueName: \"kubernetes.io/projected/644fb20b-5134-49f6-8316-bbe031aa1c40-kube-api-access-4lchz\") pod \"644fb20b-5134-49f6-8316-bbe031aa1c40\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " Apr 24 20:03:42.687616 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.687436 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/644fb20b-5134-49f6-8316-bbe031aa1c40-kserve-provision-location\") pod \"644fb20b-5134-49f6-8316-bbe031aa1c40\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " Apr 24 20:03:42.687616 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.687463 2564 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/644fb20b-5134-49f6-8316-bbe031aa1c40-proxy-tls\") pod \"644fb20b-5134-49f6-8316-bbe031aa1c40\" (UID: \"644fb20b-5134-49f6-8316-bbe031aa1c40\") " Apr 24 20:03:42.687730 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.687673 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/644fb20b-5134-49f6-8316-bbe031aa1c40-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "644fb20b-5134-49f6-8316-bbe031aa1c40" (UID: "644fb20b-5134-49f6-8316-bbe031aa1c40"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:03:42.687886 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.687860 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/644fb20b-5134-49f6-8316-bbe031aa1c40-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "644fb20b-5134-49f6-8316-bbe031aa1c40" (UID: "644fb20b-5134-49f6-8316-bbe031aa1c40"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:03:42.689539 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.689515 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/644fb20b-5134-49f6-8316-bbe031aa1c40-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "644fb20b-5134-49f6-8316-bbe031aa1c40" (UID: "644fb20b-5134-49f6-8316-bbe031aa1c40"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:03:42.689661 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.689565 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/644fb20b-5134-49f6-8316-bbe031aa1c40-kube-api-access-4lchz" (OuterVolumeSpecName: "kube-api-access-4lchz") pod "644fb20b-5134-49f6-8316-bbe031aa1c40" (UID: "644fb20b-5134-49f6-8316-bbe031aa1c40"). InnerVolumeSpecName "kube-api-access-4lchz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:03:42.788682 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.788600 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/644fb20b-5134-49f6-8316-bbe031aa1c40-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:03:42.788682 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.788627 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/644fb20b-5134-49f6-8316-bbe031aa1c40-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:03:42.788682 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.788638 2564 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/644fb20b-5134-49f6-8316-bbe031aa1c40-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:03:42.788682 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:42.788649 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4lchz\" (UniqueName: \"kubernetes.io/projected/644fb20b-5134-49f6-8316-bbe031aa1c40-kube-api-access-4lchz\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:03:43.207892 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.207852 2564 generic.go:358] "Generic (PLEG): container 
finished" podID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerID="d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa" exitCode=0 Apr 24 20:03:43.208287 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.207933 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerDied","Data":"d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa"} Apr 24 20:03:43.208287 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.207970 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" event={"ID":"644fb20b-5134-49f6-8316-bbe031aa1c40","Type":"ContainerDied","Data":"c0033f9b301865e1fcff8493b048b82749ca66dfa664570e89665ca9db939c76"} Apr 24 20:03:43.208287 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.207942 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7" Apr 24 20:03:43.208287 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.207990 2564 scope.go:117] "RemoveContainer" containerID="87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952" Apr 24 20:03:43.208287 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.208204 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:03:43.215743 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.215722 2564 scope.go:117] "RemoveContainer" containerID="d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa" Apr 24 20:03:43.223244 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.223214 2564 scope.go:117] "RemoveContainer" 
containerID="7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a" Apr 24 20:03:43.227777 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.227115 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7"] Apr 24 20:03:43.230825 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.230767 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-cv6p7"] Apr 24 20:03:43.235332 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.235308 2564 scope.go:117] "RemoveContainer" containerID="87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952" Apr 24 20:03:43.235640 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:03:43.235619 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952\": container with ID starting with 87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952 not found: ID does not exist" containerID="87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952" Apr 24 20:03:43.235689 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.235649 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952"} err="failed to get container status \"87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952\": rpc error: code = NotFound desc = could not find container \"87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952\": container with ID starting with 87fd4f9a49cdfcdfa6b722570c052bddc669690fe928bd819b788a6a9ed54952 not found: ID does not exist" Apr 24 20:03:43.235689 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.235668 2564 scope.go:117] "RemoveContainer" containerID="d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa" Apr 24 
20:03:43.235943 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:03:43.235925 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa\": container with ID starting with d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa not found: ID does not exist" containerID="d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa" Apr 24 20:03:43.236028 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.235948 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa"} err="failed to get container status \"d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa\": rpc error: code = NotFound desc = could not find container \"d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa\": container with ID starting with d1f3cb2c233efb8e0094d41664543192000c8d8a1938fcf0733121aac678e2aa not found: ID does not exist" Apr 24 20:03:43.236028 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.235962 2564 scope.go:117] "RemoveContainer" containerID="7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a" Apr 24 20:03:43.236204 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:03:43.236186 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a\": container with ID starting with 7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a not found: ID does not exist" containerID="7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a" Apr 24 20:03:43.236244 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:43.236211 2564 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a"} err="failed to get container status \"7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a\": rpc error: code = NotFound desc = could not find container \"7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a\": container with ID starting with 7bfdf9f93a929b9ca59af21e777b9eab82cfae8c4e04cc4e48c77be45761706a not found: ID does not exist" Apr 24 20:03:45.184291 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:45.184250 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" path="/var/lib/kubelet/pods/644fb20b-5134-49f6-8316-bbe031aa1c40/volumes" Apr 24 20:03:48.213100 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:48.213073 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:03:48.213683 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:48.213652 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:03:58.214292 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:03:58.214247 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:04:08.213738 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:08.213697 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:04:18.214399 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:18.214356 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:04:28.214501 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:28.214461 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:04:38.213736 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:38.213695 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 24 20:04:48.214444 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:48.214405 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" Apr 24 20:04:55.915073 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.915038 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"] Apr 24 20:04:55.915672 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.915376 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" 
containerID="cri-o://e27e9406e9867f759c4cf067cde35ac1584dd96ea4cf01c4ec435e2a784e3f4c" gracePeriod=30 Apr 24 20:04:55.915672 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.915414 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kube-rbac-proxy" containerID="cri-o://1e210768aa26e999ddc6c20b43f42edfb20ac1eabcd3e8c9266b45790fc12e2d" gracePeriod=30 Apr 24 20:04:55.980615 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980577 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"] Apr 24 20:04:55.980878 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980864 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="storage-initializer" Apr 24 20:04:55.980927 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980881 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="storage-initializer" Apr 24 20:04:55.980927 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980892 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kserve-container" Apr 24 20:04:55.980927 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980898 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kserve-container" Apr 24 20:04:55.980927 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980910 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kube-rbac-proxy" Apr 24 20:04:55.980927 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980916 2564 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kube-rbac-proxy"
Apr 24 20:04:55.981086 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980969 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kube-rbac-proxy"
Apr 24 20:04:55.981086 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.980979 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="644fb20b-5134-49f6-8316-bbe031aa1c40" containerName="kserve-container"
Apr 24 20:04:55.984221 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.984195 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:55.986451 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.986430 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\""
Apr 24 20:04:55.986576 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.986456 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 24 20:04:55.992890 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:55.992867 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"]
Apr 24 20:04:56.140144 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.140104 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da99034c-3de5-4ced-b2ca-3d3ce20255f6-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.140370 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.140151 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrpvj\" (UniqueName: \"kubernetes.io/projected/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kube-api-access-mrpvj\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.140370 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.140221 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da99034c-3de5-4ced-b2ca-3d3ce20255f6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.140370 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.140278 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.241689 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.241651 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.241877 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.241721 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da99034c-3de5-4ced-b2ca-3d3ce20255f6-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.241877 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.241749 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrpvj\" (UniqueName: \"kubernetes.io/projected/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kube-api-access-mrpvj\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.241877 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.241775 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da99034c-3de5-4ced-b2ca-3d3ce20255f6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.242228 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.242201 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.242448 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.242425 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da99034c-3de5-4ced-b2ca-3d3ce20255f6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.244238 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.244215 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da99034c-3de5-4ced-b2ca-3d3ce20255f6-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.249867 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.249841 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrpvj\" (UniqueName: \"kubernetes.io/projected/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kube-api-access-mrpvj\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.295827 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.295777 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:04:56.410648 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.410613 2564 generic.go:358] "Generic (PLEG): container finished" podID="58fa2331-6d65-4820-b748-c4151cebf06b" containerID="1e210768aa26e999ddc6c20b43f42edfb20ac1eabcd3e8c9266b45790fc12e2d" exitCode=2
Apr 24 20:04:56.410803 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.410683 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerDied","Data":"1e210768aa26e999ddc6c20b43f42edfb20ac1eabcd3e8c9266b45790fc12e2d"}
Apr 24 20:04:56.417609 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:56.417582 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"]
Apr 24 20:04:56.420473 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:04:56.420449 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda99034c_3de5_4ced_b2ca_3d3ce20255f6.slice/crio-f836cafebeddca7de6dd1f038aa11068f2b9a05305ed8c4995844dc4ec65ddd2 WatchSource:0}: Error finding container f836cafebeddca7de6dd1f038aa11068f2b9a05305ed8c4995844dc4ec65ddd2: Status 404 returned error can't find the container with id f836cafebeddca7de6dd1f038aa11068f2b9a05305ed8c4995844dc4ec65ddd2
Apr 24 20:04:57.414626 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:57.414590 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerStarted","Data":"48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39"}
Apr 24 20:04:57.414626 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:57.414628 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerStarted","Data":"f836cafebeddca7de6dd1f038aa11068f2b9a05305ed8c4995844dc4ec65ddd2"}
Apr 24 20:04:58.209281 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:58.209239 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.58:8643/healthz\": dial tcp 10.132.0.58:8643: connect: connection refused"
Apr 24 20:04:58.214589 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:58.214564 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused"
Apr 24 20:04:59.422393 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.422362 2564 generic.go:358] "Generic (PLEG): container finished" podID="58fa2331-6d65-4820-b748-c4151cebf06b" containerID="e27e9406e9867f759c4cf067cde35ac1584dd96ea4cf01c4ec435e2a784e3f4c" exitCode=0
Apr 24 20:04:59.422393 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.422401 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerDied","Data":"e27e9406e9867f759c4cf067cde35ac1584dd96ea4cf01c4ec435e2a784e3f4c"}
Apr 24 20:04:59.456863 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.456841 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"
Apr 24 20:04:59.568865 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.568831 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58fa2331-6d65-4820-b748-c4151cebf06b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"58fa2331-6d65-4820-b748-c4151cebf06b\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") "
Apr 24 20:04:59.569045 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.568886 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58fa2331-6d65-4820-b748-c4151cebf06b-kserve-provision-location\") pod \"58fa2331-6d65-4820-b748-c4151cebf06b\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") "
Apr 24 20:04:59.569045 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.568933 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dv2qs\" (UniqueName: \"kubernetes.io/projected/58fa2331-6d65-4820-b748-c4151cebf06b-kube-api-access-dv2qs\") pod \"58fa2331-6d65-4820-b748-c4151cebf06b\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") "
Apr 24 20:04:59.569045 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.568953 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58fa2331-6d65-4820-b748-c4151cebf06b-proxy-tls\") pod \"58fa2331-6d65-4820-b748-c4151cebf06b\" (UID: \"58fa2331-6d65-4820-b748-c4151cebf06b\") "
Apr 24 20:04:59.569284 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.569227 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58fa2331-6d65-4820-b748-c4151cebf06b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "58fa2331-6d65-4820-b748-c4151cebf06b" (UID: "58fa2331-6d65-4820-b748-c4151cebf06b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 20:04:59.569284 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.569237 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58fa2331-6d65-4820-b748-c4151cebf06b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "58fa2331-6d65-4820-b748-c4151cebf06b" (UID: "58fa2331-6d65-4820-b748-c4151cebf06b"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 20:04:59.571100 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.571071 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58fa2331-6d65-4820-b748-c4151cebf06b-kube-api-access-dv2qs" (OuterVolumeSpecName: "kube-api-access-dv2qs") pod "58fa2331-6d65-4820-b748-c4151cebf06b" (UID: "58fa2331-6d65-4820-b748-c4151cebf06b"). InnerVolumeSpecName "kube-api-access-dv2qs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 20:04:59.571100 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.571088 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58fa2331-6d65-4820-b748-c4151cebf06b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "58fa2331-6d65-4820-b748-c4151cebf06b" (UID: "58fa2331-6d65-4820-b748-c4151cebf06b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 20:04:59.669681 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.669629 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/58fa2331-6d65-4820-b748-c4151cebf06b-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:04:59.669681 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.669675 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58fa2331-6d65-4820-b748-c4151cebf06b-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:04:59.669681 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.669687 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dv2qs\" (UniqueName: \"kubernetes.io/projected/58fa2331-6d65-4820-b748-c4151cebf06b-kube-api-access-dv2qs\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:04:59.669681 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:04:59.669696 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58fa2331-6d65-4820-b748-c4151cebf06b-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:05:00.426649 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.426530 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x" event={"ID":"58fa2331-6d65-4820-b748-c4151cebf06b","Type":"ContainerDied","Data":"4ad6b770fd23999f8e748a473e0ea1e63506eb0625b45eef1778788509e11210"}
Apr 24 20:05:00.426649 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.426581 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"
Apr 24 20:05:00.426649 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.426608 2564 scope.go:117] "RemoveContainer" containerID="1e210768aa26e999ddc6c20b43f42edfb20ac1eabcd3e8c9266b45790fc12e2d"
Apr 24 20:05:00.428001 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.427979 2564 generic.go:358] "Generic (PLEG): container finished" podID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerID="48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39" exitCode=0
Apr 24 20:05:00.428099 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.428030 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerDied","Data":"48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39"}
Apr 24 20:05:00.435305 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.435284 2564 scope.go:117] "RemoveContainer" containerID="e27e9406e9867f759c4cf067cde35ac1584dd96ea4cf01c4ec435e2a784e3f4c"
Apr 24 20:05:00.442613 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.442568 2564 scope.go:117] "RemoveContainer" containerID="607dddfb426fa8cb1a4a338cd719c3d347718e36a4a21ac45caba4217a80d38b"
Apr 24 20:05:00.459735 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.459713 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"]
Apr 24 20:05:00.462864 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:00.462841 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-lq56x"]
Apr 24 20:05:01.184165 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:01.184129 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" path="/var/lib/kubelet/pods/58fa2331-6d65-4820-b748-c4151cebf06b/volumes"
Apr 24 20:05:01.433306 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:01.433274 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerStarted","Data":"cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c"}
Apr 24 20:05:01.433799 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:01.433319 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerStarted","Data":"72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1"}
Apr 24 20:05:01.433799 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:01.433586 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:05:01.451934 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:01.451888 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" podStartSLOduration=6.451873335 podStartE2EDuration="6.451873335s" podCreationTimestamp="2026-04-24 20:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:05:01.449742738 +0000 UTC m=+3494.773986901" watchObservedRunningTime="2026-04-24 20:05:01.451873335 +0000 UTC m=+3494.776117498"
Apr 24 20:05:02.436704 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:02.436674 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:05:08.445608 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:08.445513 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:05:38.518582 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:38.518523 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 24 20:05:48.448496 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:48.448469 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"
Apr 24 20:05:56.037399 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.037364 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"]
Apr 24 20:05:56.037901 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.037803 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kserve-container" containerID="cri-o://72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1" gracePeriod=30
Apr 24 20:05:56.038033 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.037963 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kube-rbac-proxy" containerID="cri-o://cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c" gracePeriod=30
Apr 24 20:05:56.061229 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:05:56.061199 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda99034c_3de5_4ced_b2ca_3d3ce20255f6.slice/crio-cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c.scope\": RecentStats: unable to find data in memory cache]"
Apr 24 20:05:56.117002 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.116964 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"]
Apr 24 20:05:56.117294 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117281 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container"
Apr 24 20:05:56.117350 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117296 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container"
Apr 24 20:05:56.117350 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117305 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="storage-initializer"
Apr 24 20:05:56.117350 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117312 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="storage-initializer"
Apr 24 20:05:56.117350 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117319 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kube-rbac-proxy"
Apr 24 20:05:56.117350 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117325 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kube-rbac-proxy"
Apr 24 20:05:56.117511 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117372 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kube-rbac-proxy"
Apr 24 20:05:56.117511 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.117383 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="58fa2331-6d65-4820-b748-c4151cebf06b" containerName="kserve-container"
Apr 24 20:05:56.121634 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.121609 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.123896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.123870 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\""
Apr 24 20:05:56.124016 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.123972 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\""
Apr 24 20:05:56.130465 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.130439 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"]
Apr 24 20:05:56.192372 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.192328 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.192599 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.192408 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv8hp\" (UniqueName: \"kubernetes.io/projected/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kube-api-access-gv8hp\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.192599 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.192444 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.192599 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.192479 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.293934 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.293833 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8hp\" (UniqueName: \"kubernetes.io/projected/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kube-api-access-gv8hp\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.293934 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.293879 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.294187 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.294007 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.294187 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.294069 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.294406 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.294388 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.294744 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.294722 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.296463 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.296441 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.301508 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.301486 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8hp\" (UniqueName: \"kubernetes.io/projected/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kube-api-access-gv8hp\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-stx4q\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.434110 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.434070 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:05:56.557936 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.557733 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"]
Apr 24 20:05:56.560647 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:05:56.560619 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4cd865c2_1061_4df3_b73c_3d9e9c828ee3.slice/crio-701c00c92e813fba7176ca1e99d4e7e412576a466f4a76826711ccfd405f572c WatchSource:0}: Error finding container 701c00c92e813fba7176ca1e99d4e7e412576a466f4a76826711ccfd405f572c: Status 404 returned error can't find the container with id 701c00c92e813fba7176ca1e99d4e7e412576a466f4a76826711ccfd405f572c
Apr 24 20:05:56.562367 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.562351 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 20:05:56.596644 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.596609 2564 generic.go:358] "Generic (PLEG): container finished" podID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerID="cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c" exitCode=2
Apr 24 20:05:56.596794 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.596681 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerDied","Data":"cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c"}
Apr 24 20:05:56.597817 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:56.597788 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerStarted","Data":"701c00c92e813fba7176ca1e99d4e7e412576a466f4a76826711ccfd405f572c"}
Apr 24 20:05:57.601987 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:57.601953 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerStarted","Data":"12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d"}
Apr 24 20:05:58.440867 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:58.440827 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused"
Apr 24 20:05:59.487755 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:05:59.487709 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.59:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 24 20:06:00.611723 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:00.611688 2564 generic.go:358] "Generic (PLEG): container finished" podID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerID="12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d" exitCode=0
Apr 24 20:06:00.612118 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:00.611760 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerDied","Data":"12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d"}
Apr 24 20:06:01.616651 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:01.616610 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerStarted","Data":"3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e"}
Apr 24 20:06:01.617097 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:01.616666 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerStarted","Data":"229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b"}
Apr 24 20:06:01.617097 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:01.616951 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:06:01.636082 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:01.636025 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podStartSLOduration=5.636009369 podStartE2EDuration="5.636009369s" podCreationTimestamp="2026-04-24 20:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:06:01.634365189 +0000 UTC m=+3554.958609356" watchObservedRunningTime="2026-04-24 20:06:01.636009369 +0000 UTC m=+3554.960253531"
Apr 24 20:06:02.619351 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:02.619315 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"
Apr 24 20:06:02.620610 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:02.620583 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 24 20:06:03.440248 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:03.440202 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused"
Apr 24 20:06:03.623435 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:03.623391 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused"
Apr 24 20:06:04.289624 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.289599
2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" Apr 24 20:06:04.353869 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.353831 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kserve-provision-location\") pod \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " Apr 24 20:06:04.353869 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.353873 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da99034c-3de5-4ced-b2ca-3d3ce20255f6-proxy-tls\") pod \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " Apr 24 20:06:04.354142 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.353898 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da99034c-3de5-4ced-b2ca-3d3ce20255f6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " Apr 24 20:06:04.354142 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.353934 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrpvj\" (UniqueName: \"kubernetes.io/projected/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kube-api-access-mrpvj\") pod \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\" (UID: \"da99034c-3de5-4ced-b2ca-3d3ce20255f6\") " Apr 24 20:06:04.354294 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.354264 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kserve-provision-location" 
(OuterVolumeSpecName: "kserve-provision-location") pod "da99034c-3de5-4ced-b2ca-3d3ce20255f6" (UID: "da99034c-3de5-4ced-b2ca-3d3ce20255f6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:06:04.354347 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.354289 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da99034c-3de5-4ced-b2ca-3d3ce20255f6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "da99034c-3de5-4ced-b2ca-3d3ce20255f6" (UID: "da99034c-3de5-4ced-b2ca-3d3ce20255f6"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:06:04.355959 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.355936 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da99034c-3de5-4ced-b2ca-3d3ce20255f6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "da99034c-3de5-4ced-b2ca-3d3ce20255f6" (UID: "da99034c-3de5-4ced-b2ca-3d3ce20255f6"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:06:04.356068 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.356033 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kube-api-access-mrpvj" (OuterVolumeSpecName: "kube-api-access-mrpvj") pod "da99034c-3de5-4ced-b2ca-3d3ce20255f6" (UID: "da99034c-3de5-4ced-b2ca-3d3ce20255f6"). InnerVolumeSpecName "kube-api-access-mrpvj". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:06:04.454762 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.454730 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mrpvj\" (UniqueName: \"kubernetes.io/projected/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kube-api-access-mrpvj\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:06:04.454762 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.454760 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da99034c-3de5-4ced-b2ca-3d3ce20255f6-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:06:04.454762 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.454770 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da99034c-3de5-4ced-b2ca-3d3ce20255f6-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:06:04.454997 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.454780 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da99034c-3de5-4ced-b2ca-3d3ce20255f6-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:06:04.628496 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.628459 2564 generic.go:358] "Generic (PLEG): container finished" podID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerID="72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1" exitCode=0 Apr 24 20:06:04.628972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.628534 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" Apr 24 20:06:04.628972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.628538 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerDied","Data":"72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1"} Apr 24 20:06:04.628972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.628593 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw" event={"ID":"da99034c-3de5-4ced-b2ca-3d3ce20255f6","Type":"ContainerDied","Data":"f836cafebeddca7de6dd1f038aa11068f2b9a05305ed8c4995844dc4ec65ddd2"} Apr 24 20:06:04.628972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.628609 2564 scope.go:117] "RemoveContainer" containerID="cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c" Apr 24 20:06:04.636466 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.636449 2564 scope.go:117] "RemoveContainer" containerID="72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1" Apr 24 20:06:04.643660 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.643642 2564 scope.go:117] "RemoveContainer" containerID="48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39" Apr 24 20:06:04.649799 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.649771 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"] Apr 24 20:06:04.650759 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.650743 2564 scope.go:117] "RemoveContainer" containerID="cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c" Apr 24 20:06:04.650994 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:06:04.650978 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c\": container with ID starting with cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c not found: ID does not exist" containerID="cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c" Apr 24 20:06:04.651050 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.651002 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c"} err="failed to get container status \"cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c\": rpc error: code = NotFound desc = could not find container \"cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c\": container with ID starting with cca943b796e2d52000546c7bc94a3e785f23784687888ae44b6321bb1435236c not found: ID does not exist" Apr 24 20:06:04.651050 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.651019 2564 scope.go:117] "RemoveContainer" containerID="72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1" Apr 24 20:06:04.651366 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:06:04.651342 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1\": container with ID starting with 72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1 not found: ID does not exist" containerID="72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1" Apr 24 20:06:04.651464 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.651383 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1"} err="failed to get container status \"72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1\": rpc error: code = NotFound desc 
= could not find container \"72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1\": container with ID starting with 72c621b070a1cee24f2d83f15db05f35788374a3779ade38bcb4ea3984c058a1 not found: ID does not exist" Apr 24 20:06:04.651464 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.651406 2564 scope.go:117] "RemoveContainer" containerID="48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39" Apr 24 20:06:04.651779 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:06:04.651729 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39\": container with ID starting with 48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39 not found: ID does not exist" containerID="48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39" Apr 24 20:06:04.651779 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.651767 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39"} err="failed to get container status \"48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39\": rpc error: code = NotFound desc = could not find container \"48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39\": container with ID starting with 48f7c07a60db47df35281c20b848427edb69e89a6eae62f410dc7ea36d6bab39 not found: ID does not exist" Apr 24 20:06:04.652859 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:04.652839 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-nt9mw"] Apr 24 20:06:05.184261 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:05.184224 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" path="/var/lib/kubelet/pods/da99034c-3de5-4ced-b2ca-3d3ce20255f6/volumes" 
Apr 24 20:06:08.628697 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:08.628661 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" Apr 24 20:06:08.629282 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:08.629254 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 24 20:06:18.629135 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:18.629092 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 24 20:06:28.629352 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:28.629292 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 24 20:06:38.629236 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:38.629151 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 24 20:06:48.630183 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:48.630143 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 24 20:06:58.630195 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:06:58.630098 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 24 20:07:08.630644 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:08.630610 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" Apr 24 20:07:16.336346 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336302 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d"] Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336621 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kserve-container" Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336635 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kserve-container" Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336658 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="storage-initializer" Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336664 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="storage-initializer" Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336670 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" 
containerName="kube-rbac-proxy" Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336676 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kube-rbac-proxy" Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336725 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kserve-container" Apr 24 20:07:16.336806 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.336737 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="da99034c-3de5-4ced-b2ca-3d3ce20255f6" containerName="kube-rbac-proxy" Apr 24 20:07:16.339793 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.339777 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.342945 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.342911 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 24 20:07:16.343537 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.343514 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 24 20:07:16.343690 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.343591 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 24 20:07:16.353496 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.353470 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d"] Apr 24 20:07:16.376562 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.376520 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwq2w\" 
(UniqueName: \"kubernetes.io/projected/97c946c0-69cd-41fb-af3f-231b6257a7af-kube-api-access-dwq2w\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.376747 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.376606 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97c946c0-69cd-41fb-af3f-231b6257a7af-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.376747 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.376631 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.376747 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.376698 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c946c0-69cd-41fb-af3f-231b6257a7af-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.381074 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.381050 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"] Apr 24 20:07:16.381389 ip-10-0-129-124 kubenswrapper[2564]: I0424 
20:07:16.381364 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" containerID="cri-o://229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b" gracePeriod=30 Apr 24 20:07:16.381475 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.381400 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kube-rbac-proxy" containerID="cri-o://3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e" gracePeriod=30 Apr 24 20:07:16.477115 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.477072 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97c946c0-69cd-41fb-af3f-231b6257a7af-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.477115 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.477122 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.477365 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.477149 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c946c0-69cd-41fb-af3f-231b6257a7af-kserve-provision-location\") pod 
\"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.477365 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.477192 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwq2w\" (UniqueName: \"kubernetes.io/projected/97c946c0-69cd-41fb-af3f-231b6257a7af-kube-api-access-dwq2w\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.477365 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:07:16.477304 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-predictor-serving-cert: secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 24 20:07:16.477487 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:07:16.477377 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls podName:97c946c0-69cd-41fb-af3f-231b6257a7af nodeName:}" failed. No retries permitted until 2026-04-24 20:07:16.97736168 +0000 UTC m=+3630.301605826 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls") pod "isvc-sklearn-s3-predictor-5f67dfb489-bn47d" (UID: "97c946c0-69cd-41fb-af3f-231b6257a7af") : secret "isvc-sklearn-s3-predictor-serving-cert" not found Apr 24 20:07:16.477643 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.477622 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c946c0-69cd-41fb-af3f-231b6257a7af-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.477852 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.477835 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97c946c0-69cd-41fb-af3f-231b6257a7af-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.485421 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.485393 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwq2w\" (UniqueName: \"kubernetes.io/projected/97c946c0-69cd-41fb-af3f-231b6257a7af-kube-api-access-dwq2w\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.835861 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.835825 2564 generic.go:358] "Generic (PLEG): container finished" podID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerID="3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e" exitCode=2 Apr 24 20:07:16.836009 
ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.835885 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerDied","Data":"3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e"} Apr 24 20:07:16.981813 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.981767 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:16.984188 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:16.984162 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls\") pod \"isvc-sklearn-s3-predictor-5f67dfb489-bn47d\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:17.252293 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:17.252249 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:17.375566 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:17.375445 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d"] Apr 24 20:07:17.378337 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:07:17.378307 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c946c0_69cd_41fb_af3f_231b6257a7af.slice/crio-278f127ae8e43ffc8dce98b50e69743632aaf6639d48cda939435915e52ecdef WatchSource:0}: Error finding container 278f127ae8e43ffc8dce98b50e69743632aaf6639d48cda939435915e52ecdef: Status 404 returned error can't find the container with id 278f127ae8e43ffc8dce98b50e69743632aaf6639d48cda939435915e52ecdef Apr 24 20:07:17.840132 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:17.840093 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerStarted","Data":"89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc"} Apr 24 20:07:17.840132 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:17.840136 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerStarted","Data":"278f127ae8e43ffc8dce98b50e69743632aaf6639d48cda939435915e52ecdef"} Apr 24 20:07:18.623799 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:18.623756 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.60:8643/healthz\": dial tcp 10.132.0.60:8643: connect: connection refused" Apr 24 
20:07:18.630186 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:18.630149 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 24 20:07:18.844491 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:18.844403 2564 generic.go:358] "Generic (PLEG): container finished" podID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerID="89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc" exitCode=0 Apr 24 20:07:18.844491 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:18.844456 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerDied","Data":"89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc"} Apr 24 20:07:19.849760 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:19.849722 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerStarted","Data":"d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837"} Apr 24 20:07:19.850149 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:19.849766 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerStarted","Data":"944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214"} Apr 24 20:07:19.850149 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:19.849868 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:19.869029 ip-10-0-129-124 kubenswrapper[2564]: 
I0424 20:07:19.868981 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podStartSLOduration=3.868966129 podStartE2EDuration="3.868966129s" podCreationTimestamp="2026-04-24 20:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:07:19.867391412 +0000 UTC m=+3633.191635574" watchObservedRunningTime="2026-04-24 20:07:19.868966129 +0000 UTC m=+3633.193210594" Apr 24 20:07:20.126165 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.126141 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" Apr 24 20:07:20.206064 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.206026 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kserve-provision-location\") pod \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " Apr 24 20:07:20.206264 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.206095 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv8hp\" (UniqueName: \"kubernetes.io/projected/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kube-api-access-gv8hp\") pod \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " Apr 24 20:07:20.206264 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.206168 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " Apr 
24 20:07:20.206264 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.206202 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-proxy-tls\") pod \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\" (UID: \"4cd865c2-1061-4df3-b73c-3d9e9c828ee3\") " Apr 24 20:07:20.206444 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.206415 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4cd865c2-1061-4df3-b73c-3d9e9c828ee3" (UID: "4cd865c2-1061-4df3-b73c-3d9e9c828ee3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:07:20.206497 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.206475 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "4cd865c2-1061-4df3-b73c-3d9e9c828ee3" (UID: "4cd865c2-1061-4df3-b73c-3d9e9c828ee3"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:07:20.208154 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.208127 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4cd865c2-1061-4df3-b73c-3d9e9c828ee3" (UID: "4cd865c2-1061-4df3-b73c-3d9e9c828ee3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:07:20.208270 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.208197 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kube-api-access-gv8hp" (OuterVolumeSpecName: "kube-api-access-gv8hp") pod "4cd865c2-1061-4df3-b73c-3d9e9c828ee3" (UID: "4cd865c2-1061-4df3-b73c-3d9e9c828ee3"). InnerVolumeSpecName "kube-api-access-gv8hp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:07:20.307729 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.307683 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:07:20.307729 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.307719 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:07:20.307729 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.307729 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:07:20.307729 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.307740 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gv8hp\" (UniqueName: \"kubernetes.io/projected/4cd865c2-1061-4df3-b73c-3d9e9c828ee3-kube-api-access-gv8hp\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:07:20.854408 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.854373 2564 generic.go:358] "Generic (PLEG): container 
finished" podID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerID="229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b" exitCode=0 Apr 24 20:07:20.854845 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.854466 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" Apr 24 20:07:20.854845 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.854465 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerDied","Data":"229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b"} Apr 24 20:07:20.854845 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.854511 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q" event={"ID":"4cd865c2-1061-4df3-b73c-3d9e9c828ee3","Type":"ContainerDied","Data":"701c00c92e813fba7176ca1e99d4e7e412576a466f4a76826711ccfd405f572c"} Apr 24 20:07:20.854845 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.854532 2564 scope.go:117] "RemoveContainer" containerID="3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e" Apr 24 20:07:20.855183 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.855155 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:20.856587 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.856542 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:07:20.862783 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.862573 2564 scope.go:117] 
"RemoveContainer" containerID="229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b" Apr 24 20:07:20.869569 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.869538 2564 scope.go:117] "RemoveContainer" containerID="12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d" Apr 24 20:07:20.876028 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.876003 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"] Apr 24 20:07:20.876632 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.876617 2564 scope.go:117] "RemoveContainer" containerID="3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e" Apr 24 20:07:20.876949 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:07:20.876931 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e\": container with ID starting with 3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e not found: ID does not exist" containerID="3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e" Apr 24 20:07:20.877015 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.876958 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e"} err="failed to get container status \"3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e\": rpc error: code = NotFound desc = could not find container \"3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e\": container with ID starting with 3a7632c741c7fd2ba7321c8baeef71f7337d04a87b8b06e21ef4aef92af1a66e not found: ID does not exist" Apr 24 20:07:20.877015 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.876978 2564 scope.go:117] "RemoveContainer" containerID="229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b" Apr 24 
20:07:20.877207 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:07:20.877192 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b\": container with ID starting with 229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b not found: ID does not exist" containerID="229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b" Apr 24 20:07:20.877307 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.877210 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b"} err="failed to get container status \"229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b\": rpc error: code = NotFound desc = could not find container \"229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b\": container with ID starting with 229debff2665ac3efd1cc066af9d024d9c4b7a273434183d0cc4ed056311224b not found: ID does not exist" Apr 24 20:07:20.877307 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.877224 2564 scope.go:117] "RemoveContainer" containerID="12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d" Apr 24 20:07:20.877468 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:07:20.877447 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d\": container with ID starting with 12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d not found: ID does not exist" containerID="12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d" Apr 24 20:07:20.877515 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.877473 2564 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d"} err="failed to get container status \"12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d\": rpc error: code = NotFound desc = could not find container \"12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d\": container with ID starting with 12c1121da513e65fdf828258647a3db958d5830ce2b724bfb14bb2c5d2d9940d not found: ID does not exist" Apr 24 20:07:20.879346 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:20.879323 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-stx4q"] Apr 24 20:07:21.184507 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:21.184422 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" path="/var/lib/kubelet/pods/4cd865c2-1061-4df3-b73c-3d9e9c828ee3/volumes" Apr 24 20:07:21.858766 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:21.858723 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:07:26.863615 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:26.863584 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:07:26.864064 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:26.864036 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:07:36.868264 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:36.868105 2564 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:07:46.864695 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:46.864652 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:07:56.864231 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:07:56.864191 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:08:06.865055 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:06.864971 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:08:16.864083 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:16.864039 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:08:26.865360 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:26.865330 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" 
Apr 24 20:08:36.440986 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.440946 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d"] Apr 24 20:08:36.441510 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.441307 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" containerID="cri-o://944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214" gracePeriod=30 Apr 24 20:08:36.441510 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.441408 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kube-rbac-proxy" containerID="cri-o://d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837" gracePeriod=30 Apr 24 20:08:36.555274 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555238 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"] Apr 24 20:08:36.555621 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555586 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kube-rbac-proxy" Apr 24 20:08:36.555621 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555602 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kube-rbac-proxy" Apr 24 20:08:36.555621 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555619 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="storage-initializer" Apr 24 20:08:36.555621 ip-10-0-129-124 kubenswrapper[2564]: I0424 
20:08:36.555624 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="storage-initializer" Apr 24 20:08:36.555776 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555632 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" Apr 24 20:08:36.555776 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555637 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" Apr 24 20:08:36.555776 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555683 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kube-rbac-proxy" Apr 24 20:08:36.555776 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.555694 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4cd865c2-1061-4df3-b73c-3d9e9c828ee3" containerName="kserve-container" Apr 24 20:08:36.558602 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.558581 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.560959 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.560930 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 24 20:08:36.561092 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.560987 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 24 20:08:36.561231 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.561211 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 20:08:36.567663 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.567642 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"] Apr 24 20:08:36.598240 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.598213 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.598400 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.598245 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: 
\"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.598400 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.598273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sblr\" (UniqueName: \"kubernetes.io/projected/325eac7d-52d4-497b-a929-eaee0c06746c-kube-api-access-2sblr\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.598400 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.598313 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.598400 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.598336 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/325eac7d-52d4-497b-a929-eaee0c06746c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.699214 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.699183 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: 
\"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.699399 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.699240 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.699399 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:08:36.699336 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 24 20:08:36.699530 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.699340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2sblr\" (UniqueName: \"kubernetes.io/projected/325eac7d-52d4-497b-a929-eaee0c06746c-kube-api-access-2sblr\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.699530 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:08:36.699420 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls podName:325eac7d-52d4-497b-a929-eaee0c06746c nodeName:}" failed. No retries permitted until 2026-04-24 20:08:37.199397118 +0000 UTC m=+3710.523641279 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls") pod "isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" (UID: "325eac7d-52d4-497b-a929-eaee0c06746c") : secret "isvc-sklearn-s3-tls-global-pass-predictor-serving-cert" not found Apr 24 20:08:36.699530 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.699475 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.699530 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.699513 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/325eac7d-52d4-497b-a929-eaee0c06746c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.699928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.699903 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/325eac7d-52d4-497b-a929-eaee0c06746c-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.700078 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.700056 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.700139 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.700088 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.708069 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.708043 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sblr\" (UniqueName: \"kubernetes.io/projected/325eac7d-52d4-497b-a929-eaee0c06746c-kube-api-access-2sblr\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:36.859784 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.859742 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.61:8643/healthz\": dial tcp 10.132.0.61:8643: connect: connection refused" Apr 24 20:08:36.864037 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:36.864007 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" 
podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 24 20:08:37.073733 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:37.073651 2564 generic.go:358] "Generic (PLEG): container finished" podID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerID="d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837" exitCode=2 Apr 24 20:08:37.073733 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:37.073711 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerDied","Data":"d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837"} Apr 24 20:08:37.205375 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:37.205325 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:37.207760 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:37.207731 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:37.468751 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:37.468709 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:37.591357 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:37.591315 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"] Apr 24 20:08:37.595255 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:08:37.595224 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod325eac7d_52d4_497b_a929_eaee0c06746c.slice/crio-cbe24f0614995e8dcc93c1a12ad3149726b87f3e7e09b78e3eb3cef2fe7bff1e WatchSource:0}: Error finding container cbe24f0614995e8dcc93c1a12ad3149726b87f3e7e09b78e3eb3cef2fe7bff1e: Status 404 returned error can't find the container with id cbe24f0614995e8dcc93c1a12ad3149726b87f3e7e09b78e3eb3cef2fe7bff1e Apr 24 20:08:38.077947 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:38.077907 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerStarted","Data":"7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c"} Apr 24 20:08:38.077947 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:38.077947 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerStarted","Data":"cbe24f0614995e8dcc93c1a12ad3149726b87f3e7e09b78e3eb3cef2fe7bff1e"} Apr 24 20:08:39.082924 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:39.082828 2564 generic.go:358] "Generic (PLEG): container finished" podID="325eac7d-52d4-497b-a929-eaee0c06746c" containerID="7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c" exitCode=0 Apr 24 20:08:39.083359 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:39.082926 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerDied","Data":"7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c"} Apr 24 20:08:40.088518 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.088475 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerStarted","Data":"a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1"} Apr 24 20:08:40.088518 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.088514 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerStarted","Data":"399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443"} Apr 24 20:08:40.088986 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.088669 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:40.109331 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.109269 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podStartSLOduration=4.109248588 podStartE2EDuration="4.109248588s" podCreationTimestamp="2026-04-24 20:08:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:08:40.107545975 +0000 UTC m=+3713.431790138" watchObservedRunningTime="2026-04-24 20:08:40.109248588 +0000 UTC m=+3713.433492752" Apr 24 20:08:40.780911 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.780887 2564 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:08:40.836664 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.836620 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c946c0-69cd-41fb-af3f-231b6257a7af-kserve-provision-location\") pod \"97c946c0-69cd-41fb-af3f-231b6257a7af\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " Apr 24 20:08:40.836861 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.836685 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwq2w\" (UniqueName: \"kubernetes.io/projected/97c946c0-69cd-41fb-af3f-231b6257a7af-kube-api-access-dwq2w\") pod \"97c946c0-69cd-41fb-af3f-231b6257a7af\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " Apr 24 20:08:40.836861 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.836709 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97c946c0-69cd-41fb-af3f-231b6257a7af-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"97c946c0-69cd-41fb-af3f-231b6257a7af\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " Apr 24 20:08:40.836861 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.836811 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls\") pod \"97c946c0-69cd-41fb-af3f-231b6257a7af\" (UID: \"97c946c0-69cd-41fb-af3f-231b6257a7af\") " Apr 24 20:08:40.837025 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.836940 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c946c0-69cd-41fb-af3f-231b6257a7af-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "97c946c0-69cd-41fb-af3f-231b6257a7af" (UID: "97c946c0-69cd-41fb-af3f-231b6257a7af"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:08:40.837086 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.837025 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97c946c0-69cd-41fb-af3f-231b6257a7af-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:08:40.837086 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.837076 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c946c0-69cd-41fb-af3f-231b6257a7af-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "97c946c0-69cd-41fb-af3f-231b6257a7af" (UID: "97c946c0-69cd-41fb-af3f-231b6257a7af"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:08:40.838873 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.838839 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c946c0-69cd-41fb-af3f-231b6257a7af-kube-api-access-dwq2w" (OuterVolumeSpecName: "kube-api-access-dwq2w") pod "97c946c0-69cd-41fb-af3f-231b6257a7af" (UID: "97c946c0-69cd-41fb-af3f-231b6257a7af"). InnerVolumeSpecName "kube-api-access-dwq2w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:08:40.838990 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.838909 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "97c946c0-69cd-41fb-af3f-231b6257a7af" (UID: "97c946c0-69cd-41fb-af3f-231b6257a7af"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:08:40.937466 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.937363 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97c946c0-69cd-41fb-af3f-231b6257a7af-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:08:40.937466 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.937412 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwq2w\" (UniqueName: \"kubernetes.io/projected/97c946c0-69cd-41fb-af3f-231b6257a7af-kube-api-access-dwq2w\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:08:40.937466 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:40.937425 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97c946c0-69cd-41fb-af3f-231b6257a7af-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:08:41.093352 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.093316 2564 generic.go:358] "Generic (PLEG): container finished" podID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerID="944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214" exitCode=0 Apr 24 20:08:41.093870 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.093397 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerDied","Data":"944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214"} Apr 24 20:08:41.093870 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.093410 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" Apr 24 20:08:41.093870 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.093434 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d" event={"ID":"97c946c0-69cd-41fb-af3f-231b6257a7af","Type":"ContainerDied","Data":"278f127ae8e43ffc8dce98b50e69743632aaf6639d48cda939435915e52ecdef"} Apr 24 20:08:41.093870 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.093450 2564 scope.go:117] "RemoveContainer" containerID="d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837" Apr 24 20:08:41.094110 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.093969 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:41.095014 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.094987 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:08:41.102299 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.102183 2564 scope.go:117] "RemoveContainer" containerID="944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214" Apr 24 20:08:41.109363 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.109346 2564 scope.go:117] "RemoveContainer" containerID="89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc" Apr 24 20:08:41.115502 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.115477 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d"] Apr 24 20:08:41.117200 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.117178 2564 scope.go:117] "RemoveContainer" 
containerID="d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837" Apr 24 20:08:41.117529 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:08:41.117498 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837\": container with ID starting with d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837 not found: ID does not exist" containerID="d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837" Apr 24 20:08:41.117632 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.117541 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837"} err="failed to get container status \"d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837\": rpc error: code = NotFound desc = could not find container \"d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837\": container with ID starting with d5fb8582400c0176a7fc57b6447a241b0006fa762d29779542e47eaa7ac56837 not found: ID does not exist" Apr 24 20:08:41.117632 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.117589 2564 scope.go:117] "RemoveContainer" containerID="944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214" Apr 24 20:08:41.117903 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:08:41.117876 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214\": container with ID starting with 944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214 not found: ID does not exist" containerID="944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214" Apr 24 20:08:41.117971 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.117910 2564 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214"} err="failed to get container status \"944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214\": rpc error: code = NotFound desc = could not find container \"944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214\": container with ID starting with 944b537bef7100e6b8ca310adce7bc2868eaa9a9cd5c296bbe72d68fae962214 not found: ID does not exist" Apr 24 20:08:41.117971 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.117932 2564 scope.go:117] "RemoveContainer" containerID="89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc" Apr 24 20:08:41.118227 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.118209 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f67dfb489-bn47d"] Apr 24 20:08:41.118281 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:08:41.118217 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc\": container with ID starting with 89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc not found: ID does not exist" containerID="89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc" Apr 24 20:08:41.118281 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.118241 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc"} err="failed to get container status \"89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc\": rpc error: code = NotFound desc = could not find container \"89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc\": container with ID starting with 89e1ff43a12a4f0c8403bedb7dae59f32b7fc5fca052dd767b501117874fb8fc not found: ID does not exist" Apr 24 20:08:41.184739 
ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:41.184709 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" path="/var/lib/kubelet/pods/97c946c0-69cd-41fb-af3f-231b6257a7af/volumes" Apr 24 20:08:42.097700 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:42.097664 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:08:47.102448 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:47.102417 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:08:47.102986 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:47.102962 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:08:57.103500 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:08:57.103458 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:09:07.103436 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:07.103396 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.62:8080: connect: connection refused" Apr 24 20:09:17.103489 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:17.103448 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:09:27.103240 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:27.103198 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:09:37.103604 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:37.103489 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:09:47.104472 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:47.104435 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" Apr 24 20:09:56.586416 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:56.586374 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"] Apr 24 20:09:56.586995 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:56.586734 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" 
containerID="cri-o://399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443" gracePeriod=30 Apr 24 20:09:56.586995 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:56.586784 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kube-rbac-proxy" containerID="cri-o://a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1" gracePeriod=30 Apr 24 20:09:57.098312 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.098267 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.62:8643/healthz\": dial tcp 10.132.0.62:8643: connect: connection refused" Apr 24 20:09:57.103682 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.103649 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 24 20:09:57.311277 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.311242 2564 generic.go:358] "Generic (PLEG): container finished" podID="325eac7d-52d4-497b-a929-eaee0c06746c" containerID="a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1" exitCode=2 Apr 24 20:09:57.311443 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.311295 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerDied","Data":"a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1"} Apr 24 
20:09:57.665372 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665331 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"] Apr 24 20:09:57.665896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665781 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" Apr 24 20:09:57.665896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665799 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kserve-container" Apr 24 20:09:57.665896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665809 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kube-rbac-proxy" Apr 24 20:09:57.665896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665818 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kube-rbac-proxy" Apr 24 20:09:57.665896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665831 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="storage-initializer" Apr 24 20:09:57.665896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665841 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="storage-initializer" Apr 24 20:09:57.666214 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665918 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" containerName="kube-rbac-proxy" Apr 24 20:09:57.666214 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.665932 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="97c946c0-69cd-41fb-af3f-231b6257a7af" 
containerName="kserve-container" Apr 24 20:09:57.668967 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.668946 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.671322 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.671299 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 24 20:09:57.671431 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.671390 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 24 20:09:57.677661 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.677637 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"] Apr 24 20:09:57.724729 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.724693 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4xhx\" (UniqueName: \"kubernetes.io/projected/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kube-api-access-p4xhx\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.724906 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.724744 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.724906 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.724782 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.724906 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.724808 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.825153 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.825120 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4xhx\" (UniqueName: \"kubernetes.io/projected/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kube-api-access-p4xhx\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.825331 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.825166 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.825331 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.825285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.825463 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.825346 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.825463 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:09:57.825409 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 24 20:09:57.825537 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:09:57.825479 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls podName:911dc47b-30f8-42ca-a4cf-9f09c7db5f85 nodeName:}" failed. No retries permitted until 2026-04-24 20:09:58.325460654 +0000 UTC m=+3791.649704795 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls") pod "isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" (UID: "911dc47b-30f8-42ca-a4cf-9f09c7db5f85") : secret "isvc-sklearn-s3-tls-global-fail-predictor-serving-cert" not found Apr 24 20:09:57.825791 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.825771 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.825831 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.825806 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:57.834008 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:57.833981 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4xhx\" (UniqueName: \"kubernetes.io/projected/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kube-api-access-p4xhx\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" Apr 24 20:09:58.328527 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:58.328491 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"
Apr 24 20:09:58.330936 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:58.330916 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"
Apr 24 20:09:58.579037 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:58.578934 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"
Apr 24 20:09:58.707960 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:58.707934 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"]
Apr 24 20:09:58.710570 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:09:58.710520 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod911dc47b_30f8_42ca_a4cf_9f09c7db5f85.slice/crio-a6999e61571c7186b6b9d5c1f6a28c17a170ed67de51d43c7039e7b2410a3323 WatchSource:0}: Error finding container a6999e61571c7186b6b9d5c1f6a28c17a170ed67de51d43c7039e7b2410a3323: Status 404 returned error can't find the container with id a6999e61571c7186b6b9d5c1f6a28c17a170ed67de51d43c7039e7b2410a3323
Apr 24 20:09:59.318009 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:59.317966 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" event={"ID":"911dc47b-30f8-42ca-a4cf-9f09c7db5f85","Type":"ContainerStarted","Data":"7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663"}
Apr 24 20:09:59.318009 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:09:59.318010 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" event={"ID":"911dc47b-30f8-42ca-a4cf-9f09c7db5f85","Type":"ContainerStarted","Data":"a6999e61571c7186b6b9d5c1f6a28c17a170ed67de51d43c7039e7b2410a3323"}
Apr 24 20:10:00.929227 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.929203 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"
Apr 24 20:10:00.947378 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947349 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-cabundle-cert\") pod \"325eac7d-52d4-497b-a929-eaee0c06746c\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") "
Apr 24 20:10:00.947540 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947427 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/325eac7d-52d4-497b-a929-eaee0c06746c-kserve-provision-location\") pod \"325eac7d-52d4-497b-a929-eaee0c06746c\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") "
Apr 24 20:10:00.947540 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947462 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sblr\" (UniqueName: \"kubernetes.io/projected/325eac7d-52d4-497b-a929-eaee0c06746c-kube-api-access-2sblr\") pod \"325eac7d-52d4-497b-a929-eaee0c06746c\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") "
Apr 24 20:10:00.947540 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947499 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls\") pod \"325eac7d-52d4-497b-a929-eaee0c06746c\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") "
Apr 24 20:10:00.947540 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947532 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"325eac7d-52d4-497b-a929-eaee0c06746c\" (UID: \"325eac7d-52d4-497b-a929-eaee0c06746c\") "
Apr 24 20:10:00.947874 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947797 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "325eac7d-52d4-497b-a929-eaee0c06746c" (UID: "325eac7d-52d4-497b-a929-eaee0c06746c"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 20:10:00.947874 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947859 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/325eac7d-52d4-497b-a929-eaee0c06746c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "325eac7d-52d4-497b-a929-eaee0c06746c" (UID: "325eac7d-52d4-497b-a929-eaee0c06746c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 20:10:00.948008 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.947950 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "325eac7d-52d4-497b-a929-eaee0c06746c" (UID: "325eac7d-52d4-497b-a929-eaee0c06746c"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 20:10:00.950079 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.950050 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "325eac7d-52d4-497b-a929-eaee0c06746c" (UID: "325eac7d-52d4-497b-a929-eaee0c06746c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 20:10:00.950201 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:00.950159 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/325eac7d-52d4-497b-a929-eaee0c06746c-kube-api-access-2sblr" (OuterVolumeSpecName: "kube-api-access-2sblr") pod "325eac7d-52d4-497b-a929-eaee0c06746c" (UID: "325eac7d-52d4-497b-a929-eaee0c06746c"). InnerVolumeSpecName "kube-api-access-2sblr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 20:10:01.048721 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.048611 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/325eac7d-52d4-497b-a929-eaee0c06746c-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:01.048721 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.048658 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:01.048721 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.048671 2564 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/325eac7d-52d4-497b-a929-eaee0c06746c-cabundle-cert\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:01.048721 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.048682 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/325eac7d-52d4-497b-a929-eaee0c06746c-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:01.048721 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.048692 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2sblr\" (UniqueName: \"kubernetes.io/projected/325eac7d-52d4-497b-a929-eaee0c06746c-kube-api-access-2sblr\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:01.325514 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.325414 2564 generic.go:358] "Generic (PLEG): container finished" podID="325eac7d-52d4-497b-a929-eaee0c06746c" containerID="399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443" exitCode=0
Apr 24 20:10:01.325514 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.325492 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerDied","Data":"399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443"}
Apr 24 20:10:01.325514 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.325503 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"
Apr 24 20:10:01.325812 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.325528 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24" event={"ID":"325eac7d-52d4-497b-a929-eaee0c06746c","Type":"ContainerDied","Data":"cbe24f0614995e8dcc93c1a12ad3149726b87f3e7e09b78e3eb3cef2fe7bff1e"}
Apr 24 20:10:01.325812 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.325544 2564 scope.go:117] "RemoveContainer" containerID="a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1"
Apr 24 20:10:01.333374 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.333355 2564 scope.go:117] "RemoveContainer" containerID="399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443"
Apr 24 20:10:01.340578 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.340542 2564 scope.go:117] "RemoveContainer" containerID="7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c"
Apr 24 20:10:01.344728 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.344705 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"]
Apr 24 20:10:01.348561 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.348533 2564 scope.go:117] "RemoveContainer" containerID="a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1"
Apr 24 20:10:01.348673 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.348654 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-64b9f98b85-5zz24"]
Apr 24 20:10:01.348881 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:10:01.348854 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1\": container with ID starting with a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1 not found: ID does not exist" containerID="a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1"
Apr 24 20:10:01.348928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.348889 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1"} err="failed to get container status \"a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1\": rpc error: code = NotFound desc = could not find container \"a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1\": container with ID starting with a84dcd4e718d2b4e473d65c1d8be012175b9d34b5f305fa0529af182407709b1 not found: ID does not exist"
Apr 24 20:10:01.348928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.348910 2564 scope.go:117] "RemoveContainer" containerID="399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443"
Apr 24 20:10:01.349177 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:10:01.349156 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443\": container with ID starting with 399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443 not found: ID does not exist" containerID="399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443"
Apr 24 20:10:01.349280 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.349180 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443"} err="failed to get container status \"399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443\": rpc error: code = NotFound desc = could not find container \"399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443\": container with ID starting with 399ee594a9f32f6478fd80915994b7d87c849dfc7095ac1d7994ecd38ed1b443 not found: ID does not exist"
Apr 24 20:10:01.349280 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.349200 2564 scope.go:117] "RemoveContainer" containerID="7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c"
Apr 24 20:10:01.349398 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:10:01.349383 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c\": container with ID starting with 7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c not found: ID does not exist" containerID="7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c"
Apr 24 20:10:01.349436 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:01.349401 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c"} err="failed to get container status \"7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c\": rpc error: code = NotFound desc = could not find container \"7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c\": container with ID starting with 7d99999fad8023341fc41bb99a379d2e35798320c96763f2bdd0ed663621a19c not found: ID does not exist"
Apr 24 20:10:02.330577 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:02.330535 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_911dc47b-30f8-42ca-a4cf-9f09c7db5f85/storage-initializer/0.log"
Apr 24 20:10:02.330961 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:02.330587 2564 generic.go:358] "Generic (PLEG): container finished" podID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerID="7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663" exitCode=1
Apr 24 20:10:02.330961 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:02.330664 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" event={"ID":"911dc47b-30f8-42ca-a4cf-9f09c7db5f85","Type":"ContainerDied","Data":"7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663"}
Apr 24 20:10:03.184881 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:03.184843 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" path="/var/lib/kubelet/pods/325eac7d-52d4-497b-a929-eaee0c06746c/volumes"
Apr 24 20:10:03.335420 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:03.335394 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_911dc47b-30f8-42ca-a4cf-9f09c7db5f85/storage-initializer/0.log"
Apr 24 20:10:03.335848 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:03.335465 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" event={"ID":"911dc47b-30f8-42ca-a4cf-9f09c7db5f85","Type":"ContainerStarted","Data":"6a9b5efd48ba1388dd96d01c0cd9351ad0747d870f69977f4ec8ced61d9e3f52"}
Apr 24 20:10:07.347991 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:07.347959 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_911dc47b-30f8-42ca-a4cf-9f09c7db5f85/storage-initializer/1.log"
Apr 24 20:10:07.348424 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:07.348272 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_911dc47b-30f8-42ca-a4cf-9f09c7db5f85/storage-initializer/0.log"
Apr 24 20:10:07.348424 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:07.348304 2564 generic.go:358] "Generic (PLEG): container finished" podID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerID="6a9b5efd48ba1388dd96d01c0cd9351ad0747d870f69977f4ec8ced61d9e3f52" exitCode=1
Apr 24 20:10:07.348424 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:07.348358 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" event={"ID":"911dc47b-30f8-42ca-a4cf-9f09c7db5f85","Type":"ContainerDied","Data":"6a9b5efd48ba1388dd96d01c0cd9351ad0747d870f69977f4ec8ced61d9e3f52"}
Apr 24 20:10:07.348424 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:07.348389 2564 scope.go:117] "RemoveContainer" containerID="7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663"
Apr 24 20:10:07.348866 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:07.348837 2564 scope.go:117] "RemoveContainer" containerID="7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663"
Apr 24 20:10:07.358814 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:10:07.358779 2564 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_kserve-ci-e2e-test_911dc47b-30f8-42ca-a4cf-9f09c7db5f85_0 in pod sandbox a6999e61571c7186b6b9d5c1f6a28c17a170ed67de51d43c7039e7b2410a3323 from index: no such id: '7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663'" containerID="7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663"
Apr 24 20:10:07.358894 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:10:07.358831 2564 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_kserve-ci-e2e-test_911dc47b-30f8-42ca-a4cf-9f09c7db5f85_0 in pod sandbox a6999e61571c7186b6b9d5c1f6a28c17a170ed67de51d43c7039e7b2410a3323 from index: no such id: '7767f036476d186686a59cbcf11610f84bbcb9e6bc9cabf3bdfa935d1f5ae663'; Skipping pod \"isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_kserve-ci-e2e-test(911dc47b-30f8-42ca-a4cf-9f09c7db5f85)\"" logger="UnhandledError"
Apr 24 20:10:07.360398 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:10:07.360377 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_kserve-ci-e2e-test(911dc47b-30f8-42ca-a4cf-9f09c7db5f85)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85"
Apr 24 20:10:07.648200 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:07.648111 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"]
Apr 24 20:10:08.352161 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.352134 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_911dc47b-30f8-42ca-a4cf-9f09c7db5f85/storage-initializer/1.log"
Apr 24 20:10:08.477530 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.477508 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_911dc47b-30f8-42ca-a4cf-9f09c7db5f85/storage-initializer/1.log"
Apr 24 20:10:08.477684 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.477589 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"
Apr 24 20:10:08.503745 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.503714 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") "
Apr 24 20:10:08.503913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.503780 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4xhx\" (UniqueName: \"kubernetes.io/projected/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kube-api-access-p4xhx\") pod \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") "
Apr 24 20:10:08.503913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.503797 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kserve-provision-location\") pod \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") "
Apr 24 20:10:08.503913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.503819 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls\") pod \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\" (UID: \"911dc47b-30f8-42ca-a4cf-9f09c7db5f85\") "
Apr 24 20:10:08.504135 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.504111 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "911dc47b-30f8-42ca-a4cf-9f09c7db5f85" (UID: "911dc47b-30f8-42ca-a4cf-9f09c7db5f85"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 20:10:08.504183 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.504133 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "911dc47b-30f8-42ca-a4cf-9f09c7db5f85" (UID: "911dc47b-30f8-42ca-a4cf-9f09c7db5f85"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 20:10:08.505916 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.505893 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kube-api-access-p4xhx" (OuterVolumeSpecName: "kube-api-access-p4xhx") pod "911dc47b-30f8-42ca-a4cf-9f09c7db5f85" (UID: "911dc47b-30f8-42ca-a4cf-9f09c7db5f85"). InnerVolumeSpecName "kube-api-access-p4xhx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 20:10:08.505916 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.505907 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "911dc47b-30f8-42ca-a4cf-9f09c7db5f85" (UID: "911dc47b-30f8-42ca-a4cf-9f09c7db5f85"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 20:10:08.604847 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.604750 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:08.604847 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.604790 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p4xhx\" (UniqueName: \"kubernetes.io/projected/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kube-api-access-p4xhx\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:08.604847 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.604805 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:08.604847 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.604818 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/911dc47b-30f8-42ca-a4cf-9f09c7db5f85-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:10:08.721603 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721543 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"]
Apr 24 20:10:08.721873 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721860 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerName="storage-initializer"
Apr 24 20:10:08.721928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721874 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerName="storage-initializer"
Apr 24 20:10:08.721928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721885 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="storage-initializer"
Apr 24 20:10:08.721928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721891 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="storage-initializer"
Apr 24 20:10:08.721928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721907 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kube-rbac-proxy"
Apr 24 20:10:08.721928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721913 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kube-rbac-proxy"
Apr 24 20:10:08.721928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721923 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container"
Apr 24 20:10:08.721928 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721928 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container"
Apr 24 20:10:08.722137 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721971 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerName="storage-initializer"
Apr 24 20:10:08.722137 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721979 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kserve-container"
Apr 24 20:10:08.722137 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721986 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerName="storage-initializer"
Apr 24 20:10:08.722137 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.721993 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="325eac7d-52d4-497b-a929-eaee0c06746c" containerName="kube-rbac-proxy"
Apr 24 20:10:08.722137 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.722036 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerName="storage-initializer"
Apr 24 20:10:08.722137 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.722041 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" containerName="storage-initializer"
Apr 24 20:10:08.726341 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.726321 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.728722 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.728698 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\""
Apr 24 20:10:08.729053 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.729031 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\""
Apr 24 20:10:08.729314 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.729294 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 24 20:10:08.736456 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.736433 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"]
Apr 24 20:10:08.805856 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.805814 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6bcw\" (UniqueName: \"kubernetes.io/projected/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kube-api-access-j6bcw\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.805856 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.805862 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.806071 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.805915 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.806071 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.805953 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.806071 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.806013 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.906424 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.906332 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.906424 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.906373 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.906424 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.906401 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.906731 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.906440 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j6bcw\" (UniqueName: \"kubernetes.io/projected/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kube-api-access-j6bcw\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.906731 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.906467 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.906923 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.906890 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.907145 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.907128 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.907200 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.907136 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.908952 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.908931 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:08.913900 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:08.913878 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6bcw\" (UniqueName: \"kubernetes.io/projected/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kube-api-access-j6bcw\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:09.037955 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.037901 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"
Apr 24 20:10:09.158739 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.158621 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"]
Apr 24 20:10:09.161040 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:10:09.161013 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda493ffe_5fb9_46b3_930d_cd3f47a3c5a1.slice/crio-e9b29545d680121ed6ba596743568f362ab5e81ee49b41173744610b7eb9e7e3 WatchSource:0}: Error finding container e9b29545d680121ed6ba596743568f362ab5e81ee49b41173744610b7eb9e7e3: Status 404 returned error can't find the container with id e9b29545d680121ed6ba596743568f362ab5e81ee49b41173744610b7eb9e7e3
Apr 24 20:10:09.356842 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.356813 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw_911dc47b-30f8-42ca-a4cf-9f09c7db5f85/storage-initializer/1.log"
Apr 24 20:10:09.357318 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.356909 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw" event={"ID":"911dc47b-30f8-42ca-a4cf-9f09c7db5f85","Type":"ContainerDied","Data":"a6999e61571c7186b6b9d5c1f6a28c17a170ed67de51d43c7039e7b2410a3323"}
Apr 24 20:10:09.357318 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.356947 2564 scope.go:117] "RemoveContainer" containerID="6a9b5efd48ba1388dd96d01c0cd9351ad0747d870f69977f4ec8ced61d9e3f52"
Apr 24 20:10:09.357318 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.356964 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"
Apr 24 20:10:09.358442 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.358418 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerStarted","Data":"c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c"}
Apr 24 20:10:09.358584 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.358453 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerStarted","Data":"e9b29545d680121ed6ba596743568f362ab5e81ee49b41173744610b7eb9e7e3"}
Apr 24 20:10:09.389380 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:09.389342 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"]
Apr 24 20:10:09.391169 ip-10-0-129-124
kubenswrapper[2564]: I0424 20:10:09.391138 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-8676d75bb7-nz5tw"] Apr 24 20:10:10.363930 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:10.363897 2564 generic.go:358] "Generic (PLEG): container finished" podID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerID="c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c" exitCode=0 Apr 24 20:10:10.364328 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:10.363963 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerDied","Data":"c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c"} Apr 24 20:10:11.184366 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:11.184330 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="911dc47b-30f8-42ca-a4cf-9f09c7db5f85" path="/var/lib/kubelet/pods/911dc47b-30f8-42ca-a4cf-9f09c7db5f85/volumes" Apr 24 20:10:11.368979 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:11.368945 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerStarted","Data":"1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e"} Apr 24 20:10:11.368979 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:11.368981 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerStarted","Data":"bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957"} Apr 24 20:10:11.369425 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:11.369156 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" Apr 24 20:10:11.369425 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:11.369292 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" Apr 24 20:10:11.370459 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:11.370435 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:10:11.388018 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:11.387966 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podStartSLOduration=3.3879481350000002 podStartE2EDuration="3.387948135s" podCreationTimestamp="2026-04-24 20:10:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:10:11.387338614 +0000 UTC m=+3804.711582777" watchObservedRunningTime="2026-04-24 20:10:11.387948135 +0000 UTC m=+3804.712192305" Apr 24 20:10:12.372468 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:12.372431 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:10:17.376486 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:17.376454 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" Apr 24 
20:10:17.377090 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:17.377056 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:10:27.377602 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:27.377538 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:10:37.377405 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:37.377364 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:10:47.377431 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:47.377383 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:10:57.377095 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:10:57.377046 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:11:07.377959 ip-10-0-129-124 kubenswrapper[2564]: I0424 
20:11:07.377862 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 24 20:11:17.377717 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:17.377682 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" Apr 24 20:11:18.764531 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:18.764497 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"] Apr 24 20:11:18.764981 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:18.764855 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" containerID="cri-o://bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957" gracePeriod=30 Apr 24 20:11:18.764981 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:18.764887 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kube-rbac-proxy" containerID="cri-o://1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e" gracePeriod=30 Apr 24 20:11:19.559379 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.559344 2564 generic.go:358] "Generic (PLEG): container finished" podID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerID="1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e" exitCode=2 Apr 24 20:11:19.559585 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.559389 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerDied","Data":"1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e"} Apr 24 20:11:19.823001 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.822915 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk"] Apr 24 20:11:19.826237 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.826217 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.828389 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.828361 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 24 20:11:19.828533 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.828490 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 24 20:11:19.834527 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.834503 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk"] Apr 24 20:11:19.862962 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.862924 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqnd8\" (UniqueName: \"kubernetes.io/projected/955dbd6f-8a09-4362-9d7a-0590e2203c92-kube-api-access-xqnd8\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.862962 
ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.862961 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/955dbd6f-8a09-4362-9d7a-0590e2203c92-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.863178 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.863002 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/955dbd6f-8a09-4362-9d7a-0590e2203c92-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.863178 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.863031 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.963940 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.963884 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.964090 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.963963 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqnd8\" (UniqueName: \"kubernetes.io/projected/955dbd6f-8a09-4362-9d7a-0590e2203c92-kube-api-access-xqnd8\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.964090 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.963983 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/955dbd6f-8a09-4362-9d7a-0590e2203c92-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.964090 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.964021 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/955dbd6f-8a09-4362-9d7a-0590e2203c92-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.964090 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:19.964044 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 24 20:11:19.964316 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:19.964137 2564 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls podName:955dbd6f-8a09-4362-9d7a-0590e2203c92 nodeName:}" failed. No retries permitted until 2026-04-24 20:11:20.46411641 +0000 UTC m=+3873.788360567 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" (UID: "955dbd6f-8a09-4362-9d7a-0590e2203c92") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 24 20:11:19.964510 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.964488 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/955dbd6f-8a09-4362-9d7a-0590e2203c92-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.964750 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.964733 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/955dbd6f-8a09-4362-9d7a-0590e2203c92-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:19.972474 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:19.972451 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqnd8\" (UniqueName: \"kubernetes.io/projected/955dbd6f-8a09-4362-9d7a-0590e2203c92-kube-api-access-xqnd8\") pod 
\"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:20.467039 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:20.466992 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:20.469387 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:20.469359 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:20.737624 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:20.737506 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:20.864245 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:20.864119 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk"] Apr 24 20:11:20.866808 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:11:20.866777 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod955dbd6f_8a09_4362_9d7a_0590e2203c92.slice/crio-216f6a332bde19b6adc14a03af5ea4efe3108bb0b3d8c182b8ee6fbb69bc8579 WatchSource:0}: Error finding container 216f6a332bde19b6adc14a03af5ea4efe3108bb0b3d8c182b8ee6fbb69bc8579: Status 404 returned error can't find the container with id 216f6a332bde19b6adc14a03af5ea4efe3108bb0b3d8c182b8ee6fbb69bc8579 Apr 24 20:11:20.868699 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:20.868677 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 20:11:21.567252 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:21.567210 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" event={"ID":"955dbd6f-8a09-4362-9d7a-0590e2203c92","Type":"ContainerStarted","Data":"668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302"} Apr 24 20:11:21.567252 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:21.567253 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" event={"ID":"955dbd6f-8a09-4362-9d7a-0590e2203c92","Type":"ContainerStarted","Data":"216f6a332bde19b6adc14a03af5ea4efe3108bb0b3d8c182b8ee6fbb69bc8579"} Apr 24 20:11:22.372972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:22.372934 2564 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.64:8643/healthz\": dial tcp 10.132.0.64:8643: connect: connection refused" Apr 24 20:11:23.109894 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.109866 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" Apr 24 20:11:23.186706 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.186627 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j6bcw\" (UniqueName: \"kubernetes.io/projected/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kube-api-access-j6bcw\") pod \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " Apr 24 20:11:23.186706 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.186667 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " Apr 24 20:11:23.186929 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.186706 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-proxy-tls\") pod \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " Apr 24 20:11:23.186929 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.186735 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-cabundle-cert\") pod \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " Apr 24 20:11:23.186929 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.186776 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kserve-provision-location\") pod \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\" (UID: \"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1\") " Apr 24 20:11:23.187149 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.187127 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" (UID: "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:11:23.187210 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.187140 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" (UID: "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:11:23.187210 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.187148 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" (UID: "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:11:23.188896 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.188878 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" (UID: "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:11:23.188964 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.188937 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kube-api-access-j6bcw" (OuterVolumeSpecName: "kube-api-access-j6bcw") pod "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" (UID: "da493ffe-5fb9-46b3-930d-cd3f47a3c5a1"). InnerVolumeSpecName "kube-api-access-j6bcw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:11:23.287636 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.287592 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:23.287636 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.287629 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:23.287636 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.287640 2564 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-cabundle-cert\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:23.287877 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.287649 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:23.287877 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.287661 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j6bcw\" (UniqueName: \"kubernetes.io/projected/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1-kube-api-access-j6bcw\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:23.574574 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.574515 2564 generic.go:358] "Generic (PLEG): container finished" podID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerID="bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957" exitCode=0 
Apr 24 20:11:23.575018 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.574597 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerDied","Data":"bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957"} Apr 24 20:11:23.575018 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.574630 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" Apr 24 20:11:23.575018 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.574640 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t" event={"ID":"da493ffe-5fb9-46b3-930d-cd3f47a3c5a1","Type":"ContainerDied","Data":"e9b29545d680121ed6ba596743568f362ab5e81ee49b41173744610b7eb9e7e3"} Apr 24 20:11:23.575018 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.574657 2564 scope.go:117] "RemoveContainer" containerID="1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e" Apr 24 20:11:23.585848 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.585822 2564 scope.go:117] "RemoveContainer" containerID="bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957" Apr 24 20:11:23.593389 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.593367 2564 scope.go:117] "RemoveContainer" containerID="c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c" Apr 24 20:11:23.598174 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.598145 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"] Apr 24 20:11:23.601646 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.601618 2564 scope.go:117] "RemoveContainer" 
containerID="1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e" Apr 24 20:11:23.602039 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:23.602015 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e\": container with ID starting with 1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e not found: ID does not exist" containerID="1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e" Apr 24 20:11:23.602129 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.602054 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e"} err="failed to get container status \"1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e\": rpc error: code = NotFound desc = could not find container \"1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e\": container with ID starting with 1746ccc467482e029be4a38328730d72b2aba3f997e8464a85c1d2ca9ec2755e not found: ID does not exist" Apr 24 20:11:23.602129 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.602075 2564 scope.go:117] "RemoveContainer" containerID="bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957" Apr 24 20:11:23.602491 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:23.602473 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957\": container with ID starting with bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957 not found: ID does not exist" containerID="bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957" Apr 24 20:11:23.602491 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.602494 2564 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957"} err="failed to get container status \"bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957\": rpc error: code = NotFound desc = could not find container \"bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957\": container with ID starting with bf5a5fd5922cdb389ed2c0ba793d3722b00f8a0751963ff6a347bcbc46f4a957 not found: ID does not exist" Apr 24 20:11:23.602723 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.602510 2564 scope.go:117] "RemoveContainer" containerID="c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c" Apr 24 20:11:23.602834 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:23.602816 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c\": container with ID starting with c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c not found: ID does not exist" containerID="c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c" Apr 24 20:11:23.602891 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.602839 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c"} err="failed to get container status \"c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c\": rpc error: code = NotFound desc = could not find container \"c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c\": container with ID starting with c73f5bcb6cc0b8f95e1178733df5e4d33f8c1605c00a92b3f965918211a3af3c not found: ID does not exist" Apr 24 20:11:23.603340 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:23.603318 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-864bf645df-stx5t"] Apr 24 
20:11:25.184563 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:25.184518 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" path="/var/lib/kubelet/pods/da493ffe-5fb9-46b3-930d-cd3f47a3c5a1/volumes" Apr 24 20:11:26.585175 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:26.585145 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk_955dbd6f-8a09-4362-9d7a-0590e2203c92/storage-initializer/0.log" Apr 24 20:11:26.585573 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:26.585183 2564 generic.go:358] "Generic (PLEG): container finished" podID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerID="668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302" exitCode=1 Apr 24 20:11:26.585573 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:26.585263 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" event={"ID":"955dbd6f-8a09-4362-9d7a-0590e2203c92","Type":"ContainerDied","Data":"668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302"} Apr 24 20:11:27.590229 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:27.590202 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk_955dbd6f-8a09-4362-9d7a-0590e2203c92/storage-initializer/0.log" Apr 24 20:11:27.590728 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:27.590283 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" event={"ID":"955dbd6f-8a09-4362-9d7a-0590e2203c92","Type":"ContainerStarted","Data":"dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c"} Apr 24 20:11:29.812662 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:29.812623 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk"] Apr 24 20:11:29.813147 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:29.812948 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerName="storage-initializer" containerID="cri-o://dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c" gracePeriod=30 Apr 24 20:11:30.880695 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.880662 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"] Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.880964 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.880976 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.880994 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="storage-initializer" Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.881000 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="storage-initializer" Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.881007 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kube-rbac-proxy" Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.881012 2564 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kube-rbac-proxy" Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.881060 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kserve-container" Apr 24 20:11:30.881085 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.881069 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="da493ffe-5fb9-46b3-930d-cd3f47a3c5a1" containerName="kube-rbac-proxy" Apr 24 20:11:30.884166 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.884146 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:30.886335 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.886313 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 24 20:11:30.886452 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.886405 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 24 20:11:30.886533 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.886504 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 24 20:11:30.892395 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.892371 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"] Apr 24 20:11:30.944269 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.944223 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:30.944486 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.944302 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9930dc6f-29d4-4fab-976b-06296607144d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:30.944486 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.944341 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:30.944486 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.944366 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:30.944486 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:30.944420 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bff9q\" (UniqueName: \"kubernetes.io/projected/9930dc6f-29d4-4fab-976b-06296607144d-kube-api-access-bff9q\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.045595 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.045537 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9930dc6f-29d4-4fab-976b-06296607144d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.045805 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.045620 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.045805 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.045650 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.045805 ip-10-0-129-124 kubenswrapper[2564]: E0424 
20:11:31.045792 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert" not found Apr 24 20:11:31.045994 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.045795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bff9q\" (UniqueName: \"kubernetes.io/projected/9930dc6f-29d4-4fab-976b-06296607144d-kube-api-access-bff9q\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.045994 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:31.045894 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls podName:9930dc6f-29d4-4fab-976b-06296607144d nodeName:}" failed. No retries permitted until 2026-04-24 20:11:31.545871436 +0000 UTC m=+3884.870115592 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls") pod "isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" (UID: "9930dc6f-29d4-4fab-976b-06296607144d") : secret "isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert" not found Apr 24 20:11:31.045994 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.045933 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.046160 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.046002 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9930dc6f-29d4-4fab-976b-06296607144d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.046288 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.046265 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.046596 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.046546 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.056905 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.056886 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bff9q\" (UniqueName: \"kubernetes.io/projected/9930dc6f-29d4-4fab-976b-06296607144d-kube-api-access-bff9q\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.550248 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.550204 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.552682 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.552663 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.795451 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.795412 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" Apr 24 20:11:31.919078 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:31.919053 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"] Apr 24 20:11:31.921600 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:11:31.921568 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9930dc6f_29d4_4fab_976b_06296607144d.slice/crio-2d98d05a7e33984781924ea98ae1e3025b6f3087da31217c2e636f65d50c455e WatchSource:0}: Error finding container 2d98d05a7e33984781924ea98ae1e3025b6f3087da31217c2e636f65d50c455e: Status 404 returned error can't find the container with id 2d98d05a7e33984781924ea98ae1e3025b6f3087da31217c2e636f65d50c455e Apr 24 20:11:32.606283 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:32.606235 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerStarted","Data":"e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f"} Apr 24 20:11:32.606283 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:32.606288 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerStarted","Data":"2d98d05a7e33984781924ea98ae1e3025b6f3087da31217c2e636f65d50c455e"} Apr 24 20:11:33.358992 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.358970 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk_955dbd6f-8a09-4362-9d7a-0590e2203c92/storage-initializer/1.log" Apr 24 20:11:33.359328 ip-10-0-129-124 kubenswrapper[2564]: I0424 
20:11:33.359304 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk_955dbd6f-8a09-4362-9d7a-0590e2203c92/storage-initializer/0.log" Apr 24 20:11:33.359377 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.359368 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:33.466988 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.466950 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls\") pod \"955dbd6f-8a09-4362-9d7a-0590e2203c92\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " Apr 24 20:11:33.467204 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.467034 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/955dbd6f-8a09-4362-9d7a-0590e2203c92-kserve-provision-location\") pod \"955dbd6f-8a09-4362-9d7a-0590e2203c92\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " Apr 24 20:11:33.467204 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.467094 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqnd8\" (UniqueName: \"kubernetes.io/projected/955dbd6f-8a09-4362-9d7a-0590e2203c92-kube-api-access-xqnd8\") pod \"955dbd6f-8a09-4362-9d7a-0590e2203c92\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " Apr 24 20:11:33.467204 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.467118 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/955dbd6f-8a09-4362-9d7a-0590e2203c92-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod 
\"955dbd6f-8a09-4362-9d7a-0590e2203c92\" (UID: \"955dbd6f-8a09-4362-9d7a-0590e2203c92\") " Apr 24 20:11:33.467367 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.467339 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/955dbd6f-8a09-4362-9d7a-0590e2203c92-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "955dbd6f-8a09-4362-9d7a-0590e2203c92" (UID: "955dbd6f-8a09-4362-9d7a-0590e2203c92"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:11:33.467452 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.467435 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/955dbd6f-8a09-4362-9d7a-0590e2203c92-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "955dbd6f-8a09-4362-9d7a-0590e2203c92" (UID: "955dbd6f-8a09-4362-9d7a-0590e2203c92"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:11:33.469135 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.469109 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "955dbd6f-8a09-4362-9d7a-0590e2203c92" (UID: "955dbd6f-8a09-4362-9d7a-0590e2203c92"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:11:33.469186 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.469123 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/955dbd6f-8a09-4362-9d7a-0590e2203c92-kube-api-access-xqnd8" (OuterVolumeSpecName: "kube-api-access-xqnd8") pod "955dbd6f-8a09-4362-9d7a-0590e2203c92" (UID: "955dbd6f-8a09-4362-9d7a-0590e2203c92"). 
InnerVolumeSpecName "kube-api-access-xqnd8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:11:33.567804 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.567714 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/955dbd6f-8a09-4362-9d7a-0590e2203c92-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:33.567804 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.567747 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqnd8\" (UniqueName: \"kubernetes.io/projected/955dbd6f-8a09-4362-9d7a-0590e2203c92-kube-api-access-xqnd8\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:33.567804 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.567759 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/955dbd6f-8a09-4362-9d7a-0590e2203c92-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:33.567804 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.567770 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/955dbd6f-8a09-4362-9d7a-0590e2203c92-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:11:33.610503 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.610476 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk_955dbd6f-8a09-4362-9d7a-0590e2203c92/storage-initializer/1.log" Apr 24 20:11:33.610834 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.610817 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk_955dbd6f-8a09-4362-9d7a-0590e2203c92/storage-initializer/0.log" Apr 24 20:11:33.610883 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.610855 2564 generic.go:358] "Generic (PLEG): container finished" podID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerID="dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c" exitCode=1 Apr 24 20:11:33.610955 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.610938 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" event={"ID":"955dbd6f-8a09-4362-9d7a-0590e2203c92","Type":"ContainerDied","Data":"dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c"} Apr 24 20:11:33.610997 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.610975 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" event={"ID":"955dbd6f-8a09-4362-9d7a-0590e2203c92","Type":"ContainerDied","Data":"216f6a332bde19b6adc14a03af5ea4efe3108bb0b3d8c182b8ee6fbb69bc8579"} Apr 24 20:11:33.610997 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.610992 2564 scope.go:117] "RemoveContainer" containerID="dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c" Apr 24 20:11:33.611073 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.610943 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk" Apr 24 20:11:33.612259 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.612229 2564 generic.go:358] "Generic (PLEG): container finished" podID="9930dc6f-29d4-4fab-976b-06296607144d" containerID="e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f" exitCode=0 Apr 24 20:11:33.612365 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.612279 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerDied","Data":"e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f"} Apr 24 20:11:33.619531 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.619496 2564 scope.go:117] "RemoveContainer" containerID="668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302" Apr 24 20:11:33.626783 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.626762 2564 scope.go:117] "RemoveContainer" containerID="dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c" Apr 24 20:11:33.627041 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:33.627020 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c\": container with ID starting with dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c not found: ID does not exist" containerID="dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c" Apr 24 20:11:33.627136 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.627049 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c"} err="failed to get container status \"dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c\": rpc 
error: code = NotFound desc = could not find container \"dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c\": container with ID starting with dc8be50e9799f6cffe4b380738c70c0c0a5199d8941866cf4c62637b29e07c7c not found: ID does not exist" Apr 24 20:11:33.627136 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.627066 2564 scope.go:117] "RemoveContainer" containerID="668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302" Apr 24 20:11:33.627279 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:11:33.627260 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302\": container with ID starting with 668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302 not found: ID does not exist" containerID="668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302" Apr 24 20:11:33.627339 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.627285 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302"} err="failed to get container status \"668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302\": rpc error: code = NotFound desc = could not find container \"668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302\": container with ID starting with 668789af761892317b26e124510e65c48d56abd982b9ec22dab240365acaa302 not found: ID does not exist" Apr 24 20:11:33.722480 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.722451 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk"] Apr 24 20:11:33.726708 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:33.726680 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-5c8c5ff9c5-nhbwk"] Apr 24 
20:11:34.617450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:34.617413 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerStarted","Data":"7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee"}
Apr 24 20:11:34.617450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:34.617452 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerStarted","Data":"94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8"}
Apr 24 20:11:34.617912 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:34.617688 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"
Apr 24 20:11:34.636793 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:34.636735 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podStartSLOduration=4.63671877 podStartE2EDuration="4.63671877s" podCreationTimestamp="2026-04-24 20:11:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:11:34.635047492 +0000 UTC m=+3887.959291655" watchObservedRunningTime="2026-04-24 20:11:34.63671877 +0000 UTC m=+3887.960962996"
Apr 24 20:11:35.184120 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:35.184086 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" path="/var/lib/kubelet/pods/955dbd6f-8a09-4362-9d7a-0590e2203c92/volumes"
Apr 24 20:11:35.620533 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:35.620494 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"
Apr 24 20:11:35.621760 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:35.621735 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:11:36.622901 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:36.622855 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:11:41.627168 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:41.627125 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"
Apr 24 20:11:41.627774 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:41.627746 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:11:51.627678 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:11:51.627635 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:12:01.628012 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:01.627969 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:12:11.628279 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:11.628238 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:12:21.628596 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:21.628540 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:12:31.627798 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:31.627758 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:12:41.629243 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:41.629161 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"
Apr 24 20:12:50.931131 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:50.931090 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"]
Apr 24 20:12:50.931665 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:50.931417 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" containerID="cri-o://94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8" gracePeriod=30
Apr 24 20:12:50.931665 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:50.931469 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kube-rbac-proxy" containerID="cri-o://7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee" gracePeriod=30
Apr 24 20:12:51.624026 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:51.623969 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.66:8643/healthz\": dial tcp 10.132.0.66:8643: connect: connection refused"
Apr 24 20:12:51.628343 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:51.628309 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused"
Apr 24 20:12:51.839715 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:51.839681 2564 generic.go:358] "Generic (PLEG): container finished" podID="9930dc6f-29d4-4fab-976b-06296607144d" containerID="7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee" exitCode=2
Apr 24 20:12:51.839891 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:51.839737 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerDied","Data":"7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee"}
Apr 24 20:12:52.021248 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.021212 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"]
Apr 24 20:12:52.021913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.021629 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerName="storage-initializer"
Apr 24 20:12:52.021913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.021650 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerName="storage-initializer"
Apr 24 20:12:52.021913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.021664 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerName="storage-initializer"
Apr 24 20:12:52.021913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.021670 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerName="storage-initializer"
Apr 24 20:12:52.021913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.021718 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerName="storage-initializer"
Apr 24 20:12:52.021913 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.021807 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="955dbd6f-8a09-4362-9d7a-0590e2203c92" containerName="storage-initializer"
Apr 24 20:12:52.024770 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.024745 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.027785 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.027763 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\""
Apr 24 20:12:52.027890 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.027806 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\""
Apr 24 20:12:52.042675 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.042644 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"]
Apr 24 20:12:52.078138 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.078097 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d84bf3-f483-4365-9b03-e01701ab5c91-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.078138 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.078141 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d84bf3-f483-4365-9b03-e01701ab5c91-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.078374 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.078224 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.078374 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.078250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcvck\" (UniqueName: \"kubernetes.io/projected/26d84bf3-f483-4365-9b03-e01701ab5c91-kube-api-access-dcvck\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.178854 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.178816 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d84bf3-f483-4365-9b03-e01701ab5c91-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.179058 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.178868 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.179058 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:12:52.178955 2564 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 24 20:12:52.179058 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.178978 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dcvck\" (UniqueName: \"kubernetes.io/projected/26d84bf3-f483-4365-9b03-e01701ab5c91-kube-api-access-dcvck\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.179058 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:12:52.179006 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls podName:26d84bf3-f483-4365-9b03-e01701ab5c91 nodeName:}" failed. No retries permitted until 2026-04-24 20:12:52.678990313 +0000 UTC m=+3966.003234454 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" (UID: "26d84bf3-f483-4365-9b03-e01701ab5c91") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 24 20:12:52.179378 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.179081 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d84bf3-f483-4365-9b03-e01701ab5c91-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.179378 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.179330 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d84bf3-f483-4365-9b03-e01701ab5c91-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.179596 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.179580 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d84bf3-f483-4365-9b03-e01701ab5c91-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.189889 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.189865 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcvck\" (UniqueName: \"kubernetes.io/projected/26d84bf3-f483-4365-9b03-e01701ab5c91-kube-api-access-dcvck\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.682759 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.682718 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.685230 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.685202 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:52.934493 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:52.934388 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:12:53.069474 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:53.069438 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"]
Apr 24 20:12:53.074202 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:12:53.074172 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d84bf3_f483_4365_9b03_e01701ab5c91.slice/crio-cf201ebdfd82cff87e86c817a6be40675c1bd454cd0dcf1fdc8b87073e14f6bc WatchSource:0}: Error finding container cf201ebdfd82cff87e86c817a6be40675c1bd454cd0dcf1fdc8b87073e14f6bc: Status 404 returned error can't find the container with id cf201ebdfd82cff87e86c817a6be40675c1bd454cd0dcf1fdc8b87073e14f6bc
Apr 24 20:12:53.847453 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:53.847416 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" event={"ID":"26d84bf3-f483-4365-9b03-e01701ab5c91","Type":"ContainerStarted","Data":"2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa"}
Apr 24 20:12:53.847453 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:53.847454 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" event={"ID":"26d84bf3-f483-4365-9b03-e01701ab5c91","Type":"ContainerStarted","Data":"cf201ebdfd82cff87e86c817a6be40675c1bd454cd0dcf1fdc8b87073e14f6bc"}
Apr 24 20:12:55.372105 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.372082 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"
Apr 24 20:12:55.402222 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402189 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"9930dc6f-29d4-4fab-976b-06296607144d\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") "
Apr 24 20:12:55.402361 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402246 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9930dc6f-29d4-4fab-976b-06296607144d-kserve-provision-location\") pod \"9930dc6f-29d4-4fab-976b-06296607144d\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") "
Apr 24 20:12:55.402361 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402297 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls\") pod \"9930dc6f-29d4-4fab-976b-06296607144d\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") "
Apr 24 20:12:55.402474 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402387 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-cabundle-cert\") pod \"9930dc6f-29d4-4fab-976b-06296607144d\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") "
Apr 24 20:12:55.402531 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402499 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bff9q\" (UniqueName: \"kubernetes.io/projected/9930dc6f-29d4-4fab-976b-06296607144d-kube-api-access-bff9q\") pod \"9930dc6f-29d4-4fab-976b-06296607144d\" (UID: \"9930dc6f-29d4-4fab-976b-06296607144d\") "
Apr 24 20:12:55.402723 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402658 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9930dc6f-29d4-4fab-976b-06296607144d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9930dc6f-29d4-4fab-976b-06296607144d" (UID: "9930dc6f-29d4-4fab-976b-06296607144d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 20:12:55.402723 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402676 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "9930dc6f-29d4-4fab-976b-06296607144d" (UID: "9930dc6f-29d4-4fab-976b-06296607144d"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 20:12:55.402870 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402744 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9930dc6f-29d4-4fab-976b-06296607144d" (UID: "9930dc6f-29d4-4fab-976b-06296607144d"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 24 20:12:55.402870 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402775 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:12:55.402870 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.402796 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9930dc6f-29d4-4fab-976b-06296607144d-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:12:55.404834 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.404798 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9930dc6f-29d4-4fab-976b-06296607144d" (UID: "9930dc6f-29d4-4fab-976b-06296607144d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 20:12:55.405095 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.405060 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9930dc6f-29d4-4fab-976b-06296607144d-kube-api-access-bff9q" (OuterVolumeSpecName: "kube-api-access-bff9q") pod "9930dc6f-29d4-4fab-976b-06296607144d" (UID: "9930dc6f-29d4-4fab-976b-06296607144d"). InnerVolumeSpecName "kube-api-access-bff9q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 20:12:55.503728 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.503636 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9930dc6f-29d4-4fab-976b-06296607144d-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:12:55.503728 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.503666 2564 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9930dc6f-29d4-4fab-976b-06296607144d-cabundle-cert\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:12:55.503728 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.503677 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bff9q\" (UniqueName: \"kubernetes.io/projected/9930dc6f-29d4-4fab-976b-06296607144d-kube-api-access-bff9q\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\""
Apr 24 20:12:55.855225 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.855134 2564 generic.go:358] "Generic (PLEG): container finished" podID="9930dc6f-29d4-4fab-976b-06296607144d" containerID="94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8" exitCode=0
Apr 24 20:12:55.855225 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.855210 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"
Apr 24 20:12:55.855225 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.855216 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerDied","Data":"94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8"}
Apr 24 20:12:55.855451 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.855256 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn" event={"ID":"9930dc6f-29d4-4fab-976b-06296607144d","Type":"ContainerDied","Data":"2d98d05a7e33984781924ea98ae1e3025b6f3087da31217c2e636f65d50c455e"}
Apr 24 20:12:55.855451 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.855271 2564 scope.go:117] "RemoveContainer" containerID="7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee"
Apr 24 20:12:55.864148 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.864124 2564 scope.go:117] "RemoveContainer" containerID="94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8"
Apr 24 20:12:55.872019 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.871996 2564 scope.go:117] "RemoveContainer" containerID="e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f"
Apr 24 20:12:55.879825 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.879795 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"]
Apr 24 20:12:55.882681 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.882653 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-d6d5c84cb-xqszn"]
Apr 24 20:12:55.885169 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.885153 2564 scope.go:117] "RemoveContainer" containerID="7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee"
Apr 24 20:12:55.885476 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:12:55.885455 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee\": container with ID starting with 7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee not found: ID does not exist" containerID="7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee"
Apr 24 20:12:55.885525 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.885487 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee"} err="failed to get container status \"7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee\": rpc error: code = NotFound desc = could not find container \"7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee\": container with ID starting with 7f9f3c6a5679498d4e70a6bc610d8a041d2a9d38f706c229a27449125a9ebaee not found: ID does not exist"
Apr 24 20:12:55.885525 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.885509 2564 scope.go:117] "RemoveContainer" containerID="94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8"
Apr 24 20:12:55.885787 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:12:55.885767 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8\": container with ID starting with 94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8 not found: ID does not exist" containerID="94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8"
Apr 24 20:12:55.885842 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.885794 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8"} err="failed to get container status \"94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8\": rpc error: code = NotFound desc = could not find container \"94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8\": container with ID starting with 94efe6d02a58e38f16ccbe1896dda97d3700c684b40dbec30628e5985dc89de8 not found: ID does not exist"
Apr 24 20:12:55.885842 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.885811 2564 scope.go:117] "RemoveContainer" containerID="e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f"
Apr 24 20:12:55.886052 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:12:55.886035 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f\": container with ID starting with e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f not found: ID does not exist" containerID="e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f"
Apr 24 20:12:55.886094 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:55.886058 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f"} err="failed to get container status \"e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f\": rpc error: code = NotFound desc = could not find container \"e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f\": container with ID starting with e8bb2ffb0e996a7208e4d852ab00c6ee120af19a18b01552d7089c74b925ce7f not found: ID does not exist"
Apr 24 20:12:57.186565 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:57.186515 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9930dc6f-29d4-4fab-976b-06296607144d" path="/var/lib/kubelet/pods/9930dc6f-29d4-4fab-976b-06296607144d/volumes"
Apr 24 20:12:58.865943 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:58.865915 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7_26d84bf3-f483-4365-9b03-e01701ab5c91/storage-initializer/0.log"
Apr 24 20:12:58.866327 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:58.865953 2564 generic.go:358] "Generic (PLEG): container finished" podID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerID="2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa" exitCode=1
Apr 24 20:12:58.866327 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:58.865991 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" event={"ID":"26d84bf3-f483-4365-9b03-e01701ab5c91","Type":"ContainerDied","Data":"2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa"}
Apr 24 20:12:59.869820 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:59.869796 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7_26d84bf3-f483-4365-9b03-e01701ab5c91/storage-initializer/0.log"
Apr 24 20:12:59.870203 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:12:59.869857 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" event={"ID":"26d84bf3-f483-4365-9b03-e01701ab5c91","Type":"ContainerStarted","Data":"ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9"}
Apr 24 20:13:01.987365 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:01.983762 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"]
Apr 24 20:13:01.987365 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:01.984265 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerName="storage-initializer" containerID="cri-o://ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9" gracePeriod=30
Apr 24 20:13:04.435387 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.435362 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7_26d84bf3-f483-4365-9b03-e01701ab5c91/storage-initializer/1.log"
Apr 24 20:13:04.435760 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.435743 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7_26d84bf3-f483-4365-9b03-e01701ab5c91/storage-initializer/0.log"
Apr 24 20:13:04.435845 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.435834 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"
Apr 24 20:13:04.468315 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.468214 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d84bf3-f483-4365-9b03-e01701ab5c91-kserve-provision-location\") pod \"26d84bf3-f483-4365-9b03-e01701ab5c91\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") "
Apr 24 20:13:04.468315 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.468261 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d84bf3-f483-4365-9b03-e01701ab5c91-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"26d84bf3-f483-4365-9b03-e01701ab5c91\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") "
Apr 24 20:13:04.468315 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.468280 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls\") pod \"26d84bf3-f483-4365-9b03-e01701ab5c91\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") "
Apr 24 20:13:04.468315 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.468299 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcvck\" (UniqueName: \"kubernetes.io/projected/26d84bf3-f483-4365-9b03-e01701ab5c91-kube-api-access-dcvck\") pod \"26d84bf3-f483-4365-9b03-e01701ab5c91\" (UID: \"26d84bf3-f483-4365-9b03-e01701ab5c91\") "
Apr 24 20:13:04.468690 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.468569 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26d84bf3-f483-4365-9b03-e01701ab5c91-kserve-provision-location" (OuterVolumeSpecName:
"kserve-provision-location") pod "26d84bf3-f483-4365-9b03-e01701ab5c91" (UID: "26d84bf3-f483-4365-9b03-e01701ab5c91"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 20:13:04.468743 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.468683 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/26d84bf3-f483-4365-9b03-e01701ab5c91-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "26d84bf3-f483-4365-9b03-e01701ab5c91" (UID: "26d84bf3-f483-4365-9b03-e01701ab5c91"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 20:13:04.470448 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.470425 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26d84bf3-f483-4365-9b03-e01701ab5c91-kube-api-access-dcvck" (OuterVolumeSpecName: "kube-api-access-dcvck") pod "26d84bf3-f483-4365-9b03-e01701ab5c91" (UID: "26d84bf3-f483-4365-9b03-e01701ab5c91"). InnerVolumeSpecName "kube-api-access-dcvck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 20:13:04.470536 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.470513 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "26d84bf3-f483-4365-9b03-e01701ab5c91" (UID: "26d84bf3-f483-4365-9b03-e01701ab5c91"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 20:13:04.569389 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.569350 2564 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/26d84bf3-f483-4365-9b03-e01701ab5c91-kserve-provision-location\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:13:04.569389 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.569382 2564 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/26d84bf3-f483-4365-9b03-e01701ab5c91-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:13:04.569644 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.569398 2564 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/26d84bf3-f483-4365-9b03-e01701ab5c91-proxy-tls\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:13:04.569644 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.569411 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dcvck\" (UniqueName: \"kubernetes.io/projected/26d84bf3-f483-4365-9b03-e01701ab5c91-kube-api-access-dcvck\") on node \"ip-10-0-129-124.ec2.internal\" DevicePath \"\"" Apr 24 20:13:04.885786 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.885708 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7_26d84bf3-f483-4365-9b03-e01701ab5c91/storage-initializer/1.log" Apr 24 20:13:04.886075 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.886057 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7_26d84bf3-f483-4365-9b03-e01701ab5c91/storage-initializer/0.log" Apr 24 
20:13:04.886183 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.886098 2564 generic.go:358] "Generic (PLEG): container finished" podID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerID="ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9" exitCode=1 Apr 24 20:13:04.886183 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.886144 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" event={"ID":"26d84bf3-f483-4365-9b03-e01701ab5c91","Type":"ContainerDied","Data":"ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9"} Apr 24 20:13:04.886183 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.886179 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" event={"ID":"26d84bf3-f483-4365-9b03-e01701ab5c91","Type":"ContainerDied","Data":"cf201ebdfd82cff87e86c817a6be40675c1bd454cd0dcf1fdc8b87073e14f6bc"} Apr 24 20:13:04.886330 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.886190 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7" Apr 24 20:13:04.886330 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.886199 2564 scope.go:117] "RemoveContainer" containerID="ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9" Apr 24 20:13:04.894974 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.894825 2564 scope.go:117] "RemoveContainer" containerID="2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa" Apr 24 20:13:04.901924 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.901906 2564 scope.go:117] "RemoveContainer" containerID="ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9" Apr 24 20:13:04.902177 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:13:04.902158 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9\": container with ID starting with ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9 not found: ID does not exist" containerID="ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9" Apr 24 20:13:04.902228 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.902187 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9"} err="failed to get container status \"ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9\": rpc error: code = NotFound desc = could not find container \"ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9\": container with ID starting with ce1f0782256e58ed4e3735f27fc70b372c5553a3dbf4cc4552d4ddc36d0e20a9 not found: ID does not exist" Apr 24 20:13:04.902228 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.902207 2564 scope.go:117] "RemoveContainer" containerID="2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa" 
Apr 24 20:13:04.902414 ip-10-0-129-124 kubenswrapper[2564]: E0424 20:13:04.902397 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa\": container with ID starting with 2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa not found: ID does not exist" containerID="2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa" Apr 24 20:13:04.902453 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.902418 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa"} err="failed to get container status \"2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa\": rpc error: code = NotFound desc = could not find container \"2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa\": container with ID starting with 2489cc4d6c738eec2497fa0b6eff8c155d77190f0f608ad08e56110b3f2987fa not found: ID does not exist" Apr 24 20:13:04.921585 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.921526 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"] Apr 24 20:13:04.925701 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:04.925668 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-675c74f7fd-6rdh7"] Apr 24 20:13:05.185172 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:05.185092 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" path="/var/lib/kubelet/pods/26d84bf3-f483-4365-9b03-e01701ab5c91/volumes" Apr 24 20:13:31.473335 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473292 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tkz2z/must-gather-n5f4l"] 
Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473681 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="storage-initializer" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473700 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="storage-initializer" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473715 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerName="storage-initializer" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473723 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerName="storage-initializer" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473741 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473751 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473762 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kube-rbac-proxy" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473770 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kube-rbac-proxy" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473853 2564 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kube-rbac-proxy" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473866 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9930dc6f-29d4-4fab-976b-06296607144d" containerName="kserve-container" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473879 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerName="storage-initializer" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473957 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerName="storage-initializer" Apr 24 20:13:31.473972 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.473965 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerName="storage-initializer" Apr 24 20:13:31.474652 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.474025 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="26d84bf3-f483-4365-9b03-e01701ab5c91" containerName="storage-initializer" Apr 24 20:13:31.478472 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.478451 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.480690 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.480672 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tkz2z\"/\"default-dockercfg-k94xx\"" Apr 24 20:13:31.480811 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.480715 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tkz2z\"/\"kube-root-ca.crt\"" Apr 24 20:13:31.481694 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.481651 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tkz2z\"/\"openshift-service-ca.crt\"" Apr 24 20:13:31.484334 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.484312 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/must-gather-n5f4l"] Apr 24 20:13:31.573496 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.573457 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94vvc\" (UniqueName: \"kubernetes.io/projected/a2939983-0d15-425f-9c52-b0ba4a7e285b-kube-api-access-94vvc\") pod \"must-gather-n5f4l\" (UID: \"a2939983-0d15-425f-9c52-b0ba4a7e285b\") " pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.573496 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.573496 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2939983-0d15-425f-9c52-b0ba4a7e285b-must-gather-output\") pod \"must-gather-n5f4l\" (UID: \"a2939983-0d15-425f-9c52-b0ba4a7e285b\") " pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.674794 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.674755 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94vvc\" (UniqueName: 
\"kubernetes.io/projected/a2939983-0d15-425f-9c52-b0ba4a7e285b-kube-api-access-94vvc\") pod \"must-gather-n5f4l\" (UID: \"a2939983-0d15-425f-9c52-b0ba4a7e285b\") " pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.674794 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.674798 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2939983-0d15-425f-9c52-b0ba4a7e285b-must-gather-output\") pod \"must-gather-n5f4l\" (UID: \"a2939983-0d15-425f-9c52-b0ba4a7e285b\") " pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.675159 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.675133 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a2939983-0d15-425f-9c52-b0ba4a7e285b-must-gather-output\") pod \"must-gather-n5f4l\" (UID: \"a2939983-0d15-425f-9c52-b0ba4a7e285b\") " pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.683091 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.683056 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94vvc\" (UniqueName: \"kubernetes.io/projected/a2939983-0d15-425f-9c52-b0ba4a7e285b-kube-api-access-94vvc\") pod \"must-gather-n5f4l\" (UID: \"a2939983-0d15-425f-9c52-b0ba4a7e285b\") " pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.788123 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.788032 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tkz2z/must-gather-n5f4l" Apr 24 20:13:31.916904 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.916870 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/must-gather-n5f4l"] Apr 24 20:13:31.920361 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:13:31.920333 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2939983_0d15_425f_9c52_b0ba4a7e285b.slice/crio-6a88f1385a58631050da7bea998695fd83f046b545a1ad14ee87bd4a1893f41e WatchSource:0}: Error finding container 6a88f1385a58631050da7bea998695fd83f046b545a1ad14ee87bd4a1893f41e: Status 404 returned error can't find the container with id 6a88f1385a58631050da7bea998695fd83f046b545a1ad14ee87bd4a1893f41e Apr 24 20:13:31.962351 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:31.962313 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/must-gather-n5f4l" event={"ID":"a2939983-0d15-425f-9c52-b0ba4a7e285b","Type":"ContainerStarted","Data":"6a88f1385a58631050da7bea998695fd83f046b545a1ad14ee87bd4a1893f41e"} Apr 24 20:13:33.969526 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:33.969491 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/must-gather-n5f4l" event={"ID":"a2939983-0d15-425f-9c52-b0ba4a7e285b","Type":"ContainerStarted","Data":"90650cdcdf507016c8a54dcadbbe7cd545f9499bb4b150d10773f5f1d1d7bbdf"} Apr 24 20:13:33.969526 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:33.969531 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/must-gather-n5f4l" event={"ID":"a2939983-0d15-425f-9c52-b0ba4a7e285b","Type":"ContainerStarted","Data":"d3ce8be9ecc9b42f217eedfca76f9f56925cadbedbe21926bf400328799bc096"} Apr 24 20:13:33.990974 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:33.990906 2564 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-tkz2z/must-gather-n5f4l" podStartSLOduration=1.826778085 podStartE2EDuration="2.990881703s" podCreationTimestamp="2026-04-24 20:13:31 +0000 UTC" firstStartedPulling="2026-04-24 20:13:31.922131921 +0000 UTC m=+4005.246376063" lastFinishedPulling="2026-04-24 20:13:33.086235525 +0000 UTC m=+4006.410479681" observedRunningTime="2026-04-24 20:13:33.989860154 +0000 UTC m=+4007.314104318" watchObservedRunningTime="2026-04-24 20:13:33.990881703 +0000 UTC m=+4007.315125867" Apr 24 20:13:35.245833 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:35.245801 2564 ???:1] "http: TLS handshake error from 10.0.131.214:48724: EOF" Apr 24 20:13:35.259325 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:35.259285 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-v6hxv_62836674-92b3-4b2c-a4c9-e6896f0ff8fa/global-pull-secret-syncer/0.log" Apr 24 20:13:35.489844 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:35.489781 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6wkx2_07894671-8fdc-4378-8129-c0529b896ce6/konnectivity-agent/0.log" Apr 24 20:13:35.557577 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:35.557429 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-124.ec2.internal_3af0b9266f4203b6c4070d0308bc061d/haproxy/0.log" Apr 24 20:13:39.309336 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:39.309301 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7jbh4_493effb9-7158-4628-b742-23d621fdfb64/node-exporter/0.log" Apr 24 20:13:39.330265 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:39.330233 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7jbh4_493effb9-7158-4628-b742-23d621fdfb64/kube-rbac-proxy/0.log" Apr 24 20:13:39.350882 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:39.350858 2564 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7jbh4_493effb9-7158-4628-b742-23d621fdfb64/init-textfile/0.log" Apr 24 20:13:41.202029 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:41.202002 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-tfqzr_567b5df7-6cf3-459e-a1cc-68aa56346b42/networking-console-plugin/0.log" Apr 24 20:13:42.638348 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.638312 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z"] Apr 24 20:13:42.642940 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.642915 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.648532 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.648193 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z"] Apr 24 20:13:42.766702 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.766670 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq8pw\" (UniqueName: \"kubernetes.io/projected/07a16afc-e854-4370-96dd-d748b3385457-kube-api-access-rq8pw\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.766869 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.766718 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-proc\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.766869 
ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.766821 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-lib-modules\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.766961 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.766869 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-sys\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.766961 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.766893 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-podres\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.867778 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867742 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-proc\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.867778 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867785 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-lib-modules\") pod 
\"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.868011 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867810 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-sys\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.868011 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867832 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-podres\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.868011 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867869 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq8pw\" (UniqueName: \"kubernetes.io/projected/07a16afc-e854-4370-96dd-d748b3385457-kube-api-access-rq8pw\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.868011 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867880 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-proc\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.868011 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867880 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-sys\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.868011 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867937 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-lib-modules\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.868011 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.867950 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/07a16afc-e854-4370-96dd-d748b3385457-podres\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.878581 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.878536 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq8pw\" (UniqueName: \"kubernetes.io/projected/07a16afc-e854-4370-96dd-d748b3385457-kube-api-access-rq8pw\") pod \"perf-node-gather-daemonset-6kx9z\" (UID: \"07a16afc-e854-4370-96dd-d748b3385457\") " pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:42.956760 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:42.956721 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:43.106191 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:43.105360 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z"] Apr 24 20:13:43.112200 ip-10-0-129-124 kubenswrapper[2564]: W0424 20:13:43.112156 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod07a16afc_e854_4370_96dd_d748b3385457.slice/crio-d30404779c634ebcd83f0bdf6dae9a587887c2dd7b994ded3a1fe3bfe67ff9e4 WatchSource:0}: Error finding container d30404779c634ebcd83f0bdf6dae9a587887c2dd7b994ded3a1fe3bfe67ff9e4: Status 404 returned error can't find the container with id d30404779c634ebcd83f0bdf6dae9a587887c2dd7b994ded3a1fe3bfe67ff9e4 Apr 24 20:13:43.132582 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:43.132541 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hqq8t_72cbc52d-39db-4924-9a8f-438bf75f9a50/dns/0.log" Apr 24 20:13:43.154613 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:43.154576 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hqq8t_72cbc52d-39db-4924-9a8f-438bf75f9a50/kube-rbac-proxy/0.log" Apr 24 20:13:43.254373 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:43.254342 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-j4w5m_6d09a5ce-72ac-4dea-826a-c4a0beb36ce3/dns-node-resolver/0.log" Apr 24 20:13:43.751820 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:43.751784 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nbt8b_e51af979-d2ff-49ba-8e4e-7620a2a4cd7e/node-ca/0.log" Apr 24 20:13:44.005651 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:44.005545 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" 
event={"ID":"07a16afc-e854-4370-96dd-d748b3385457","Type":"ContainerStarted","Data":"001eecba0e6aaa5e7387ca0d8d84838ae787d4b917f439b0367f5d99e780d579"} Apr 24 20:13:44.005651 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:44.005607 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:44.005651 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:44.005622 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" event={"ID":"07a16afc-e854-4370-96dd-d748b3385457","Type":"ContainerStarted","Data":"d30404779c634ebcd83f0bdf6dae9a587887c2dd7b994ded3a1fe3bfe67ff9e4"} Apr 24 20:13:44.023998 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:44.023943 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" podStartSLOduration=2.023927547 podStartE2EDuration="2.023927547s" podCreationTimestamp="2026-04-24 20:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 20:13:44.022915045 +0000 UTC m=+4017.347159223" watchObservedRunningTime="2026-04-24 20:13:44.023927547 +0000 UTC m=+4017.348171734" Apr 24 20:13:44.852270 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:44.852234 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-jzpqz_ab654b1c-fb83-4468-9595-2d19444f6f70/serve-healthcheck-canary/0.log" Apr 24 20:13:45.319897 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:45.319857 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jr97d_f52eb3dd-0025-4b92-8240-ccd8cabc06b4/kube-rbac-proxy/0.log" Apr 24 20:13:45.341451 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:45.341414 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-jr97d_f52eb3dd-0025-4b92-8240-ccd8cabc06b4/exporter/0.log" Apr 24 20:13:45.363791 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:45.363758 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jr97d_f52eb3dd-0025-4b92-8240-ccd8cabc06b4/extractor/0.log" Apr 24 20:13:47.585326 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:47.585296 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-lw4fp_31c81dac-86cd-4bd5-95c9-6378736961c3/server/0.log" Apr 24 20:13:48.022590 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:48.022528 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-sjh7l_b8b105d2-a24a-49da-a5f4-94c41212b27c/manager/0.log" Apr 24 20:13:48.138944 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:48.138897 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-79f457656b-dnvmz_244773f8-4954-457f-b86b-6d39c169449e/seaweedfs/0.log" Apr 24 20:13:48.179503 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:48.179470 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-2h62f_f676d40b-e099-4abd-952c-a9a1e24357f2/seaweedfs-tls-custom/0.log" Apr 24 20:13:48.218483 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:48.218449 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-6qfcw_de019248-dfbd-4b16-8c85-d710c48b1922/seaweedfs-tls-serving/0.log" Apr 24 20:13:50.019482 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:50.019451 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tkz2z/perf-node-gather-daemonset-6kx9z" Apr 24 20:13:53.368450 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.368407 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-8svws_9a8daf37-7548-4dd2-ba26-c79a7de10480/kube-multus/0.log" Apr 24 20:13:53.553098 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.553065 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rb2v6_e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d/kube-multus-additional-cni-plugins/0.log" Apr 24 20:13:53.575287 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.575252 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rb2v6_e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d/egress-router-binary-copy/0.log" Apr 24 20:13:53.598427 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.598400 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rb2v6_e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d/cni-plugins/0.log" Apr 24 20:13:53.621498 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.621410 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rb2v6_e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d/bond-cni-plugin/0.log" Apr 24 20:13:53.644694 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.644661 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rb2v6_e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d/routeoverride-cni/0.log" Apr 24 20:13:53.668174 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.668137 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rb2v6_e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d/whereabouts-cni-bincopy/0.log" Apr 24 20:13:53.691720 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:53.691691 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-rb2v6_e6803a3f-1c1e-40c8-b605-ddd7da3d4d0d/whereabouts-cni/0.log" Apr 24 20:13:54.109793 ip-10-0-129-124 
kubenswrapper[2564]: I0424 20:13:54.109754 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p2bz2_9104b10b-fe94-4977-b556-addf9a7f232f/network-metrics-daemon/0.log" Apr 24 20:13:54.129491 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:54.129460 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-p2bz2_9104b10b-fe94-4977-b556-addf9a7f232f/kube-rbac-proxy/0.log" Apr 24 20:13:54.874252 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:54.874169 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/ovn-controller/0.log" Apr 24 20:13:54.928012 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:54.927976 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/ovn-acl-logging/0.log" Apr 24 20:13:54.952771 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:54.952738 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/kube-rbac-proxy-node/0.log" Apr 24 20:13:54.975063 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:54.975030 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/kube-rbac-proxy-ovn-metrics/0.log" Apr 24 20:13:54.992190 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:54.992159 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/northd/0.log" Apr 24 20:13:55.012698 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:55.012669 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/nbdb/0.log" Apr 24 20:13:55.036682 
ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:55.036651 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/sbdb/0.log" Apr 24 20:13:55.235912 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:55.235879 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-d6t5k_ccfa5733-74cd-4083-833e-376f6fc796e2/ovnkube-controller/0.log" Apr 24 20:13:56.938267 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:56.938238 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-mdmw5_f7682155-10ef-40a7-9a0c-cef3315bdd30/network-check-target-container/0.log" Apr 24 20:13:57.853532 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:57.853499 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hc4w2_519a2b19-a52e-492e-b937-624360a7c8ad/iptables-alerter/0.log" Apr 24 20:13:58.501742 ip-10-0-129-124 kubenswrapper[2564]: I0424 20:13:58.501667 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-ctptc_2017f643-59f9-484c-aabe-6af06168539a/tuned/0.log"