Apr 22 18:46:27.882564 ip-10-0-137-19 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 18:46:28.333693 ip-10-0-137-19 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:28.333693 ip-10-0-137-19 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 18:46:28.333693 ip-10-0-137-19 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:28.333693 ip-10-0-137-19 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 18:46:28.333693 ip-10-0-137-19 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 18:46:28.334428 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.334341 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 18:46:28.336628 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336614 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:28.336628 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336628 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336632 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336635 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336638 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336641 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336644 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336646 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336649 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336652 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336654 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336657 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336660 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336662 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336665 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336668 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336675 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336678 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336681 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336685 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:28.336692 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336689 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336692 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336695 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336698 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336702 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336705 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336708 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336710 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336713 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336715 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336718 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336721 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336724 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336726 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336729 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336731 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336734 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336736 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336738 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336741 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:28.337145 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336743 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336747 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336763 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336766 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336769 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336772 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336774 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336777 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336779 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336782 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336786 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336789 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336792 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336795 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336798 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336801 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336804 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336807 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336810 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:28.337714 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336812 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336815 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336817 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336820 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336822 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336825 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336827 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336830 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336832 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336836 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336839 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336841 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336844 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336846 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336849 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336851 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336854 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336857 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336860 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336862 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:28.338175 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336865 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336868 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336870 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336872 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336875 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336878 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.336880 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337253 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337271 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337275 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337278 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337280 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337283 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337286 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337289 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337292 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337295 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337297 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337300 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337302 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:28.338679 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337305 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337307 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337310 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337313 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337315 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337318 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337320 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337322 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337325 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337328 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337331 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337334 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337336 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337339 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337342 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337344 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337347 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337349 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337352 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337355 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:28.339160 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337358 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337360 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337363 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337365 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337370 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337373 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337376 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337379 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337382 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337384 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337387 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337390 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337392 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337395 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337397 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337400 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337402 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337406 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337408 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:28.339768 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337411 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337413 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337416 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337420 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337429 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337432 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337435 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337437 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337440 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337442 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337445 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337447 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337450 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337452 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337455 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337458 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337460 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337463 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337465 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337468 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:28.340223 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337471 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337473 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337476 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337478 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337481 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337484 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337487 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337490 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337492 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337495 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337497 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337500 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337502 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.337505 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338808 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338817 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338827 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338837 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338841 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338845 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338849 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 18:46:28.340725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338853 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338856 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338859 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338863 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338866 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338870 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338873 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338876 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338879 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338882 2579 flags.go:64] FLAG: --cloud-config=""
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338885 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338888 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338894 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338897 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338900 2579 flags.go:64] FLAG: --config-dir=""
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338903 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338906 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338910 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338913 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338916 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338920 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338923 2579 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338926 2579 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338929 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338932 2579 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 18:46:28.341243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338935 2579 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338939 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338942 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338945 2579 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338953 2579 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338957 2579 flags.go:64] FLAG: --enable-server="true"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338960 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338967 2579 flags.go:64] FLAG: --event-burst="100"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338971 2579 flags.go:64] FLAG: --event-qps="50"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338973 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338977 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338980 2579 flags.go:64] FLAG: --eviction-hard=""
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338983 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338994 2579 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.338997 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339000 2579 flags.go:64] FLAG: --eviction-soft=""
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339003 2579 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339006 2579 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339009 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339012 2579 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339015 2579 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339018 2579 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339021 2579 flags.go:64] FLAG: --feature-gates=""
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339024 2579 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339027 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 18:46:28.341863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339030 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339034 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339037 2579 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339040 2579 flags.go:64] FLAG: --help="false"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339042 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-137-19.ec2.internal"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339046 2579 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339049 2579 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339052 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]:
I0422 18:46:28.339055 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339058 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339062 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339064 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339075 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339078 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339081 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339084 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339088 2579 flags.go:64] FLAG: --kube-reserved="" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339091 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339093 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339096 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339099 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339102 2579 flags.go:64] FLAG: --lock-file="" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339105 2579 
flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339107 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 18:46:28.342489 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339111 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339116 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339119 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339122 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339125 2579 flags.go:64] FLAG: --logging-format="text" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339127 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339131 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339134 2579 flags.go:64] FLAG: --manifest-url="" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339137 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339141 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339144 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339148 2579 flags.go:64] FLAG: --max-pods="110" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339151 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 18:46:28.343144 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:46:28.339154 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339157 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339160 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339167 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339170 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339174 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339183 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339186 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339195 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339199 2579 flags.go:64] FLAG: --pod-cidr="" Apr 22 18:46:28.343144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339202 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339208 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339210 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339213 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 
22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339216 2579 flags.go:64] FLAG: --port="10250" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339219 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339222 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-06729cacdd1d1447c" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339226 2579 flags.go:64] FLAG: --qos-reserved="" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339229 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339232 2579 flags.go:64] FLAG: --register-node="true" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339235 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339238 2579 flags.go:64] FLAG: --register-with-taints="" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339242 2579 flags.go:64] FLAG: --registry-burst="10" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339244 2579 flags.go:64] FLAG: --registry-qps="5" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339247 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339250 2579 flags.go:64] FLAG: --reserved-memory="" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339253 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339275 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339278 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 18:46:28.343719 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339281 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339284 2579 flags.go:64] FLAG: --runonce="false" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339287 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339290 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339293 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339296 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339299 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 18:46:28.343719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339303 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339306 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339309 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339312 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339315 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339321 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339324 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339327 
2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339330 2579 flags.go:64] FLAG: --system-cgroups="" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339333 2579 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339338 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339341 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339344 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339351 2579 flags.go:64] FLAG: --tls-min-version="" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339354 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339356 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339359 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339362 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339365 2579 flags.go:64] FLAG: --v="2" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339370 2579 flags.go:64] FLAG: --version="false" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339374 2579 flags.go:64] FLAG: --vmodule="" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.339378 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:46:28.339381 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339482 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 18:46:28.344376 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339486 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339489 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339492 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339495 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339497 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339499 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339504 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339507 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339511 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339514 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339518 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339521 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339524 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339526 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339530 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339532 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339535 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339538 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339540 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 18:46:28.344964 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339543 2579 feature_gate.go:328] unrecognized 
feature gate: AWSClusterHostedDNSInstall Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339545 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339548 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339551 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339553 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339556 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339559 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339562 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339564 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339566 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339569 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339571 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339574 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 18:46:28.345490 ip-10-0-137-19 
kubenswrapper[2579]: W0422 18:46:28.339576 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339579 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339582 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339585 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339588 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339590 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 18:46:28.345490 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339593 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339595 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339598 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339600 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339603 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339605 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339608 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 
22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339610 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339613 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339621 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339624 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339626 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339629 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339631 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339634 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339637 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339641 2579 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339643 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339646 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339648 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 22 
18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339651 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 18:46:28.346497 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339653 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339656 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339658 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339661 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339663 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339666 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339668 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339671 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339673 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339675 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339678 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339681 2579 feature_gate.go:328] 
unrecognized feature gate: BootImageSkewEnforcement Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339683 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339686 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339688 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339691 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339694 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339696 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339699 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339701 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 18:46:28.347398 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339704 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 18:46:28.347992 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339707 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 18:46:28.347992 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339709 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 18:46:28.347992 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339714 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 18:46:28.347992 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339718 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:28.347992 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.339721 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:28.347992 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.340257 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.348049 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.348066 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348116 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348120 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348123 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348127 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348130 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348133 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348136 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348139 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348142 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348145 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348148 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348152 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348154 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348157 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348159 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348162 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348166 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:28.348164 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348169 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348172 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348175 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348178 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348180 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348183 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348186 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348188 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348191 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348193 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348196 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348199 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348201 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348204 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348206 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348209 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348211 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348214 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348216 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348219 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:28.348682 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348221 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348223 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348226 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348228 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348231 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348233 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348238 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348242 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348245 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348248 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348250 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348254 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348257 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348272 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348276 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348279 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348282 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348284 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348287 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:28.349171 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348290 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348293 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348295 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348298 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348300 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348303 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348306 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348309 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348311 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348314 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348316 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348319 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348321 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348326 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348329 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348332 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348334 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348337 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348340 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348343 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:28.349668 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348346 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348348 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348351 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348354 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348356 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348359 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348362 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348365 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348368 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348370 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.348375 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348493 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348499 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348502 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348505 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 18:46:28.350151 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348507 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348510 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348513 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348516 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348518 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348521 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348524 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348527 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348529 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348532 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348534 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348537 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348539 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348542 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348544 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348547 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348549 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348552 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348555 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348557 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 18:46:28.350582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348560 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348562 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348565 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348567 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348572 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348576 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348579 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348582 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348585 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348587 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348590 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348593 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348596 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348598 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348601 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348603 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348606 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348608 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348611 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 18:46:28.351074 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348613 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348616 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348618 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348621 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348623 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348626 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348628 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348631 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348634 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348636 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348639 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348641 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348644 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348647 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348649 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348652 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348654 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348657 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348660 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348663 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 18:46:28.351566 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348665 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348668 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348671 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348673 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348676 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348678 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348682 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348685 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348688 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348691 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348694 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348696 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348699 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348702 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348704 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348707 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348709 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348712 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348714 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 18:46:28.352063 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348717 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 18:46:28.352569 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348719 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 18:46:28.352569 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348722 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 18:46:28.352569 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:28.348724 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 18:46:28.352569 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.348729 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 18:46:28.352569 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.349436 2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 18:46:28.352569 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.352442 2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 18:46:28.353390 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.353376 2579 server.go:1019] "Starting client certificate rotation"
Apr 22 18:46:28.353495 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.353478 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:28.353545 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.353531 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 18:46:28.376129 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.376108 2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:28.379008 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.378980 2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 18:46:28.394948 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.394930 2579 log.go:25] "Validated CRI v1 runtime API"
Apr 22 18:46:28.401220 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.401203 2579 log.go:25] "Validated CRI v1 image API"
Apr 22 18:46:28.402557 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.402542 2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 18:46:28.407823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.407801 2579 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c6f6e594-650e-441b-b0b9-eac429c1b9b3:/dev/nvme0n1p4 f8ecda1d-1784-4996-8846-9729e3dd2047:/dev/nvme0n1p3]
Apr 22 18:46:28.407890 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.407821 2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 18:46:28.410117 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.410094 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 18:46:28.416007 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.415897 2579 manager.go:217] Machine: {Timestamp:2026-04-22 18:46:28.413353195 +0000 UTC m=+0.412927086 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099829 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2a493faeb3996ae1c61ff2eba3e179 SystemUUID:ec2a493f-aeb3-996a-e1c6-1ff2eba3e179 BootID:4ebcb127-3aa7-468b-a446-97326658718f Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:92:f1:b5:e9:d1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:92:f1:b5:e9:d1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:f2:af:aa:48:a5:4c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 18:46:28.416602 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.416591 2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 18:46:28.416731 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.416719 2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 18:46:28.418461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.418436 2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 18:46:28.418604 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.418463 2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-19.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 18:46:28.418648 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.418613 2579 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 18:46:28.418648 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.418622 2579 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 18:46:28.418648 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.418635 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:46:28.419373 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.419361 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 18:46:28.420752 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.420742 2579 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 18:46:28.420877 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.420868 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 18:46:28.423237 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.423227 2579 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 18:46:28.423293 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.423242 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 18:46:28.423293 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.423254 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 18:46:28.423381 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.423311 2579 kubelet.go:397] "Adding apiserver pod source"
Apr 22 18:46:28.423381 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.423320 2579 apiserver.go:42] "Waiting for node sync before watching
apiserver pods" Apr 22 18:46:28.424363 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.424350 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:28.424401 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.424369 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 18:46:28.427973 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.427955 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 18:46:28.429810 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.429797 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 18:46:28.431213 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431195 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431221 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431231 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431239 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431246 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431254 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431279 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 
18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431288 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431296 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 18:46:28.431300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431302 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 18:46:28.431556 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431312 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 18:46:28.431556 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.431320 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 18:46:28.432197 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.432187 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 18:46:28.432197 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.432197 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 18:46:28.435775 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.435760 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 18:46:28.435859 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.435798 2579 server.go:1295] "Started kubelet" Apr 22 18:46:28.435913 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.435887 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 18:46:28.436017 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.435899 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 18:46:28.436017 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.435960 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 18:46:28.436578 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.436556 2579 csi_plugin.go:988] Failed to contact API server 
when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-19.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 18:46:28.436692 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.436671 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-19.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 22 18:46:28.436743 ip-10-0-137-19 systemd[1]: Started Kubernetes Kubelet. Apr 22 18:46:28.436837 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.436763 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 18:46:28.437620 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.437569 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 18:46:28.438951 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.438935 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 22 18:46:28.442317 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.442292 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:28.442864 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.442848 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 22 18:46:28.443512 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.443455 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 22 18:46:28.443512 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.443456 2579 
volume_manager.go:295] "The desired_state_of_world populator starts" Apr 22 18:46:28.443512 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.443482 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 22 18:46:28.443825 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.443639 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 22 18:46:28.443825 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.443650 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 22 18:46:28.444114 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.444095 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:28.446419 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.446027 2579 factory.go:55] Registering systemd factory Apr 22 18:46:28.446419 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.446053 2579 factory.go:223] Registration of the systemd container factory successfully Apr 22 18:46:28.446560 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.446502 2579 factory.go:153] Registering CRI-O factory Apr 22 18:46:28.446560 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.446517 2579 factory.go:223] Registration of the crio container factory successfully Apr 22 18:46:28.446656 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.446569 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 22 18:46:28.446656 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.446597 2579 factory.go:103] Registering Raw factory Apr 22 18:46:28.446656 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.446612 2579 manager.go:1196] Started watching for new ooms in manager Apr 22 18:46:28.447203 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.447091 2579 manager.go:319] Starting recovery of all 
containers Apr 22 18:46:28.448852 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.448823 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-19.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 22 18:46:28.449129 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.449106 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 22 18:46:28.450906 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.449245 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-19.ec2.internal.18a8c23597ef8bf7 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-19.ec2.internal,UID:ip-10-0-137-19.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-19.ec2.internal,},FirstTimestamp:2026-04-22 18:46:28.435774455 +0000 UTC m=+0.435348344,LastTimestamp:2026-04-22 18:46:28.435774455 +0000 UTC m=+0.435348344,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-19.ec2.internal,}" Apr 22 18:46:28.460940 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.460920 2579 manager.go:324] Recovery completed Apr 22 18:46:28.465173 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.465158 2579 
kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:28.467877 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.467852 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:28.467955 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.467888 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:28.467955 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.467899 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:28.468426 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.468410 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 18:46:28.468426 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.468422 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 18:46:28.468518 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.468439 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 22 18:46:28.469944 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.469876 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-19.ec2.internal.18a8c23599d95c0e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-19.ec2.internal,UID:ip-10-0-137-19.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-137-19.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-137-19.ec2.internal,},FirstTimestamp:2026-04-22 18:46:28.46787483 +0000 UTC 
m=+0.467448718,LastTimestamp:2026-04-22 18:46:28.46787483 +0000 UTC m=+0.467448718,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-19.ec2.internal,}" Apr 22 18:46:28.470613 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.470599 2579 policy_none.go:49] "None policy: Start" Apr 22 18:46:28.470674 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.470620 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 18:46:28.470674 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.470634 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 22 18:46:28.475295 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.475276 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-srqdx" Apr 22 18:46:28.483371 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.483349 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-srqdx" Apr 22 18:46:28.483730 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.483662 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-19.ec2.internal.18a8c23599d9a601 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-19.ec2.internal,UID:ip-10-0-137-19.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-137-19.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-137-19.ec2.internal,},FirstTimestamp:2026-04-22 18:46:28.467893761 +0000 UTC m=+0.467467648,LastTimestamp:2026-04-22 18:46:28.467893761 +0000 UTC 
m=+0.467467648,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-19.ec2.internal,}" Apr 22 18:46:28.517895 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.517879 2579 manager.go:341] "Starting Device Plugin manager" Apr 22 18:46:28.518042 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.517913 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 18:46:28.518042 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.517924 2579 server.go:85] "Starting device plugin registration server" Apr 22 18:46:28.518170 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.518159 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 18:46:28.518217 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.518171 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 18:46:28.518313 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.518297 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 18:46:28.518384 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.518372 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 18:46:28.518430 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.518384 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 18:46:28.518908 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.518892 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 22 18:46:28.518983 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.518926 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:28.537068 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.537045 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 18:46:28.538386 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.538371 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 18:46:28.538479 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.538400 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 18:46:28.538479 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.538421 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 18:46:28.538479 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.538434 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 18:46:28.538609 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.538524 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 18:46:28.541091 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.541075 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:28.618725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.618641 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:28.619902 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.619884 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:28.620031 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.619922 2579 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:28.620031 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.619936 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:28.620031 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.619965 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.629534 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.629511 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.629534 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.629535 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-19.ec2.internal\": node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:28.639150 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.639127 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal"] Apr 22 18:46:28.639232 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.639193 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:28.640002 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.639983 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:28.640081 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.640013 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:28.640081 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.640023 2579 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:28.640928 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.640912 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:28.641215 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641204 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:28.641377 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641363 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.641448 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641400 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:28.641873 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641856 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:28.641941 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641856 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:28.641941 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641910 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:28.641941 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641922 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:28.642033 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641889 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:28.642033 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.641966 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:28.643099 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.643085 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.643158 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.643109 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 18:46:28.643711 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.643695 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientMemory" Apr 22 18:46:28.643792 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.643729 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 18:46:28.643792 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.643743 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeHasSufficientPID" Apr 22 18:46:28.644603 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.644589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a98f83e5fe1a17abcd4731f69cb0d908-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal\" (UID: \"a98f83e5fe1a17abcd4731f69cb0d908\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.644657 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.644610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/a98f83e5fe1a17abcd4731f69cb0d908-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal\" (UID: \"a98f83e5fe1a17abcd4731f69cb0d908\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.644657 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.644630 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/75deb02a1840e8f4bbd2a0c4ef3a9ce4-config\") pod \"kube-apiserver-proxy-ip-10-0-137-19.ec2.internal\" (UID: \"75deb02a1840e8f4bbd2a0c4ef3a9ce4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.669026 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.669006 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-19.ec2.internal\" not found" node="ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.673474 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.673453 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-19.ec2.internal\" not found" node="ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.741517 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.741485 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:28.744789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.744771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a98f83e5fe1a17abcd4731f69cb0d908-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal\" (UID: \"a98f83e5fe1a17abcd4731f69cb0d908\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.744848 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:46:28.744797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a98f83e5fe1a17abcd4731f69cb0d908-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal\" (UID: \"a98f83e5fe1a17abcd4731f69cb0d908\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.744848 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.744814 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/75deb02a1840e8f4bbd2a0c4ef3a9ce4-config\") pod \"kube-apiserver-proxy-ip-10-0-137-19.ec2.internal\" (UID: \"75deb02a1840e8f4bbd2a0c4ef3a9ce4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.744848 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.744838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/75deb02a1840e8f4bbd2a0c4ef3a9ce4-config\") pod \"kube-apiserver-proxy-ip-10-0-137-19.ec2.internal\" (UID: \"75deb02a1840e8f4bbd2a0c4ef3a9ce4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.744949 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.744858 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/a98f83e5fe1a17abcd4731f69cb0d908-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal\" (UID: \"a98f83e5fe1a17abcd4731f69cb0d908\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.744949 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.744867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a98f83e5fe1a17abcd4731f69cb0d908-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal\" (UID: \"a98f83e5fe1a17abcd4731f69cb0d908\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.841995 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.841950 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:28.942703 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:28.942671 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:28.971076 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.971050 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:28.975782 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:28.975634 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" Apr 22 18:46:29.043503 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.043461 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.143916 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.143879 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.244470 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.244393 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.344866 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.344839 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.353949 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:46:29.353144 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 18:46:29.353949 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.353322 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 18:46:29.403305 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.403254 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:29.430056 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:29.430019 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda98f83e5fe1a17abcd4731f69cb0d908.slice/crio-531bdda379d5385c34c4d73b380c7ce31b2f34ba579c3787ee47efd2063d8fbd WatchSource:0}: Error finding container 531bdda379d5385c34c4d73b380c7ce31b2f34ba579c3787ee47efd2063d8fbd: Status 404 returned error can't find the container with id 531bdda379d5385c34c4d73b380c7ce31b2f34ba579c3787ee47efd2063d8fbd Apr 22 18:46:29.433694 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.433680 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:46:29.442728 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.442711 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 18:46:29.445868 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.445849 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.454298 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.454276 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" 
type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 18:46:29.475649 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.475629 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-wvrjg" Apr 22 18:46:29.483386 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.483361 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-wvrjg" Apr 22 18:46:29.484852 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.484835 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 18:41:28 +0000 UTC" deadline="2027-12-10 10:32:03.697373527 +0000 UTC" Apr 22 18:46:29.484914 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.484852 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14319h45m34.212523737s" Apr 22 18:46:29.498979 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:29.498960 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75deb02a1840e8f4bbd2a0c4ef3a9ce4.slice/crio-006909308b3529c78ae02bcc90db56de324fc8eb9ed12ef414561f1d1fd0de86 WatchSource:0}: Error finding container 006909308b3529c78ae02bcc90db56de324fc8eb9ed12ef414561f1d1fd0de86: Status 404 returned error can't find the container with id 006909308b3529c78ae02bcc90db56de324fc8eb9ed12ef414561f1d1fd0de86 Apr 22 18:46:29.541301 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.541236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" event={"ID":"75deb02a1840e8f4bbd2a0c4ef3a9ce4","Type":"ContainerStarted","Data":"006909308b3529c78ae02bcc90db56de324fc8eb9ed12ef414561f1d1fd0de86"} Apr 22 18:46:29.542079 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:46:29.542056 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" event={"ID":"a98f83e5fe1a17abcd4731f69cb0d908","Type":"ContainerStarted","Data":"531bdda379d5385c34c4d73b380c7ce31b2f34ba579c3787ee47efd2063d8fbd"} Apr 22 18:46:29.546371 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.546358 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.584889 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.584865 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:29.647401 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.647371 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.747934 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.747859 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.848425 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.848383 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.949171 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:29.949139 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-19.ec2.internal\" not found" Apr 22 18:46:29.987335 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:29.987253 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:30.043849 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.043706 2579 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" Apr 22 18:46:30.060062 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.059999 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:30.061165 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.060934 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" Apr 22 18:46:30.070534 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.070509 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 18:46:30.424500 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.424411 2579 apiserver.go:52] "Watching apiserver" Apr 22 18:46:30.435727 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.435702 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 18:46:30.437852 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.437828 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-jnbvq","kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal","openshift-dns/node-resolver-8vvlg","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal","openshift-multus/multus-z6889","openshift-network-operator/iptables-alerter-mbb5v","openshift-ovn-kubernetes/ovnkube-node-j89vb","kube-system/konnectivity-agent-xwvlb","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6","openshift-cluster-node-tuning-operator/tuned-cmsx5","openshift-image-registry/node-ca-sdfvz","openshift-multus/multus-additional-cni-plugins-k6ptp","openshift-multus/network-metrics-daemon-n2rv2"] Apr 22 18:46:30.440760 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:46:30.440711 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.441710 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.441690 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8vvlg" Apr 22 18:46:30.441825 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.441805 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z6889" Apr 22 18:46:30.442925 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.442908 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mbb5v" Apr 22 18:46:30.443709 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.443678 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-fvffw\"" Apr 22 18:46:30.443709 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.443689 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.443863 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.443692 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.444501 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.444479 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.444671 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.444651 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.444866 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.444849 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.444938 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.444866 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lmkzb\"" Apr 22 18:46:30.444997 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.444964 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 18:46:30.445713 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445289 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9whsv\"" Apr 22 18:46:30.445713 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445340 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.445713 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445344 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 18:46:30.445713 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445386 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.445713 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445512 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.445713 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445609 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 18:46:30.445713 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445618 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-g9xd4\"" Apr 22 18:46:30.445713 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445671 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.446150 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.445859 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:30.447120 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.447081 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.447474 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.447434 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.449855 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.449829 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.450546 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.450524 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 18:46:30.450704 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.450682 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 18:46:30.450877 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.450861 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 18:46:30.450983 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.450966 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 18:46:30.452189 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452154 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sbgr7\"" Apr 22 18:46:30.452300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452204 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 18:46:30.452395 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452304 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-pd4b9\"" Apr 22 18:46:30.452457 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452371 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 18:46:30.452457 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452425 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 18:46:30.452536 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452482 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 18:46:30.452725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452699 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 18:46:30.452931 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.452911 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8lqh6\"" Apr 22 18:46:30.453090 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453060 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:30.453157 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.453124 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e" Apr 22 18:46:30.453333 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453274 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-sys\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.453333 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-cni-multus\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.453493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453336 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-etc-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.453493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453358 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-cni-netd\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.453493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453382 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysctl-d\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.453493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-systemd\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.453493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-socket-dir-parent\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.453493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453451 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44ws7\" (UniqueName: \"kubernetes.io/projected/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-kube-api-access-44ws7\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v" Apr 22 18:46:30.453493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453475 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-var-lib-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453497 
2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-cni-bin\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453523 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-k8s-cni-cncf-io\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453547 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-conf-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453569 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-slash\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-run-netns\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 
22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453628 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453669 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-modprobe-d\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453709 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-run\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453745 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-tuned\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453800 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blc8f\" (UniqueName: 
\"kubernetes.io/projected/c6c1c2a6-e381-4e17-902c-12dbd344a377-kube-api-access-blc8f\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.453860 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453826 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-systemd-units\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453849 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2w24s\" (UniqueName: \"kubernetes.io/projected/adf86e5d-38b3-4766-b4e8-e9f7a2380707-kube-api-access-2w24s\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453909 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdf18fe1-67fc-415f-b637-f1d3a5343441-hosts-file\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453931 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-os-release\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453954 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-hostroot\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453976 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.453998 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2z69\" (UniqueName: \"kubernetes.io/projected/6ebbe8a2-e463-407a-a400-add3d4b5438a-kube-api-access-v2z69\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454021 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-systemd\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovn-node-metrics-cert\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.454434 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:46:30.454080 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysctl-conf\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-system-cni-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-kubernetes\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454179 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-lib-modules\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454202 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-daemon-config\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " 
pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454247 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-kubelet\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454304 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-node-log\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454330 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-sdfvz"
Apr 22 18:46:30.454434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-iptables-alerter-script\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454360 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-host-slash\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454384 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysconfig\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454406 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fdf18fe1-67fc-415f-b637-f1d3a5343441-tmp-dir\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-etc-kubernetes\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454456 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-run-ovn-kubernetes\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454491 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-env-overrides\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454526 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k6ptp"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454544 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6c1c2a6-e381-4e17-902c-12dbd344a377-tmp\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454581 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6ebbe8a2-e463-407a-a400-add3d4b5438a-cni-binary-copy\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454597 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-netns\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454615 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovnkube-config\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454631 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovnkube-script-lib\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454659 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl65g\" (UniqueName: \"kubernetes.io/projected/fdf18fe1-67fc-415f-b637-f1d3a5343441-kube-api-access-rl65g\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-cnibin\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454730 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-cni-bin\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454754 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-host\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.455330 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454801 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-kubelet\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.456009 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-cni-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.456009 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454845 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-var-lib-kubelet\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.456009 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454892 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-multus-certs\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.456009 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454928 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-ovn\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.456009 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.454954 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-log-socket\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.456009 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.455891 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:30.456009 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.455947 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:30.456782 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.456761 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 22 18:46:30.456871 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.456857 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 22 18:46:30.457154 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.457088 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 22 18:46:30.457258 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.457193 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 22 18:46:30.457258 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.457233 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 22 18:46:30.457400 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.457377 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4jknq\""
Apr 22 18:46:30.457400 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.457393 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zv54v\""
Apr 22 18:46:30.483997 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.483961 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:29 +0000 UTC" deadline="2027-09-29 19:41:07.507630448 +0000 UTC"
Apr 22 18:46:30.483997 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.483997 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12600h54m37.023638188s"
Apr 22 18:46:30.545006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.544980 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 18:46:30.555363 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44ws7\" (UniqueName: \"kubernetes.io/projected/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-kube-api-access-44ws7\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v"
Apr 22 18:46:30.555530 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555376 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-var-lib-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555530 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-cni-bin\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555530 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555453 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-var-lib-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555530 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555459 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-cni-bin\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555530 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555500 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555541 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-k8s-cni-cncf-io\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-conf-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555591 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-k8s-cni-cncf-io\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555598 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-slash\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-run-netns\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555645 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-conf-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-slash\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555662 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-run-netns\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555693 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-sys-fs\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555709 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555728 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-modprobe-d\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-run\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.555783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-tuned\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blc8f\" (UniqueName: \"kubernetes.io/projected/c6c1c2a6-e381-4e17-902c-12dbd344a377-kube-api-access-blc8f\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-systemd-units\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-run\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555848 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2w24s\" (UniqueName: \"kubernetes.io/projected/adf86e5d-38b3-4766-b4e8-e9f7a2380707-kube-api-access-2w24s\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555878 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-system-cni-dir\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555885 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-systemd-units\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555879 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-modprobe-d\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555952 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdf18fe1-67fc-415f-b637-f1d3a5343441-hosts-file\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.555978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-os-release\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556001 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-hostroot\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556049 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fdf18fe1-67fc-415f-b637-f1d3a5343441-hosts-file\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556071 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-os-release\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556071 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-hostroot\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.556423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556116 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-registration-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-etc-selinux\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556190 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdrs\" (UniqueName: \"kubernetes.io/projected/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-kube-api-access-bjdrs\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556233 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2z69\" (UniqueName: \"kubernetes.io/projected/6ebbe8a2-e463-407a-a400-add3d4b5438a-kube-api-access-v2z69\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556278 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-systemd\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556287 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-systemd\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovn-node-metrics-cert\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-device-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysctl-conf\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556489 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-system-cni-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-socket-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-kubernetes\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-lib-modules\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-daemon-config\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdcl9\" (UniqueName: \"kubernetes.io/projected/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-kube-api-access-tdcl9\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556633 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-system-cni-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.557174 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-kubelet\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-node-log\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-kubelet\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556872 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-kubernetes\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysctl-conf\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.556765 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-lib-modules\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557043 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-serviceca\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557144 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-node-log\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557169 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6fad9739-db9a-49b1-aece-1696145ac1fb-konnectivity-ca\") pod \"konnectivity-agent-xwvlb\" (UID: \"6fad9739-db9a-49b1-aece-1696145ac1fb\") " pod="kube-system/konnectivity-agent-xwvlb"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-daemon-config\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557203 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-iptables-alerter-script\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557231 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-host-slash\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysconfig\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5"
Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557308 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started
for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fdf18fe1-67fc-415f-b637-f1d3a5343441-tmp-dir\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg" Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysconfig\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557288 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-host-slash\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v" Apr 22 18:46:30.557881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-etc-kubernetes\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557387 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-run-ovn-kubernetes\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-env-overrides\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557441 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2kc78\" (UniqueName: \"kubernetes.io/projected/4b04d910-b761-4095-a135-7026105ff82f-kube-api-access-2kc78\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557447 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-run-ovn-kubernetes\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557467 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6fad9739-db9a-49b1-aece-1696145ac1fb-agent-certs\") pod \"konnectivity-agent-xwvlb\" (UID: \"6fad9739-db9a-49b1-aece-1696145ac1fb\") " pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:46:30.557497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6c1c2a6-e381-4e17-902c-12dbd344a377-tmp\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557523 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6ebbe8a2-e463-407a-a400-add3d4b5438a-cni-binary-copy\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557550 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-netns\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557574 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovnkube-config\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fdf18fe1-67fc-415f-b637-f1d3a5343441-tmp-dir\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557602 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovnkube-script-lib\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q75c\" (UniqueName: \"kubernetes.io/projected/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-kube-api-access-5q75c\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557662 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rl65g\" (UniqueName: \"kubernetes.io/projected/fdf18fe1-67fc-415f-b637-f1d3a5343441-kube-api-access-rl65g\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-cnibin\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557717 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-cni-bin\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557741 2579 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-host\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.558651 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557748 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-iptables-alerter-script\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557785 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cnibin\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557837 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-host\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" 
Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557848 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-env-overrides\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-kubelet\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-kubelet\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557900 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-os-release\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557939 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") 
" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.557996 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-cnibin\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558038 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-host\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558120 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6ebbe8a2-e463-407a-a400-add3d4b5438a-cni-binary-copy\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-cni-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558210 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-var-lib-kubelet\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.559461 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:46:30.557663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-netns\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558244 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovnkube-config\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-multus-certs\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-cni-bin\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.559461 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558305 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-run-multus-certs\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558313 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-ovn\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558338 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-cni-dir\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-log-socket\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558408 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-log-socket\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558441 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-var-lib-kubelet\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558443 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovnkube-script-lib\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558468 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-run-ovn\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558493 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-etc-kubernetes\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558494 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-sys\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-cni-multus\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-sys\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558561 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-etc-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558580 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-host-var-lib-cni-multus\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-cni-netd\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558613 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-etc-openvswitch\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558613 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysctl-d\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/adf86e5d-38b3-4766-b4e8-e9f7a2380707-host-cni-netd\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-systemd\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558680 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-systemd\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-sysctl-d\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558716 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-socket-dir-parent\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.558811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6ebbe8a2-e463-407a-a400-add3d4b5438a-multus-socket-dir-parent\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.560009 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c6c1c2a6-e381-4e17-902c-12dbd344a377-tmp\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.560054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c6c1c2a6-e381-4e17-902c-12dbd344a377-etc-tuned\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.560823 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.560188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/adf86e5d-38b3-4766-b4e8-e9f7a2380707-ovn-node-metrics-cert\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.567546 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.567483 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2z69\" (UniqueName: \"kubernetes.io/projected/6ebbe8a2-e463-407a-a400-add3d4b5438a-kube-api-access-v2z69\") pod \"multus-z6889\" (UID: \"6ebbe8a2-e463-407a-a400-add3d4b5438a\") " pod="openshift-multus/multus-z6889" Apr 22 18:46:30.567546 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.567498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44ws7\" (UniqueName: \"kubernetes.io/projected/6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8-kube-api-access-44ws7\") pod \"iptables-alerter-mbb5v\" (UID: \"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8\") " pod="openshift-network-operator/iptables-alerter-mbb5v" Apr 22 18:46:30.567892 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.567869 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2w24s\" (UniqueName: \"kubernetes.io/projected/adf86e5d-38b3-4766-b4e8-e9f7a2380707-kube-api-access-2w24s\") pod \"ovnkube-node-j89vb\" (UID: \"adf86e5d-38b3-4766-b4e8-e9f7a2380707\") " pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.567954 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.567869 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blc8f\" (UniqueName: \"kubernetes.io/projected/c6c1c2a6-e381-4e17-902c-12dbd344a377-kube-api-access-blc8f\") pod \"tuned-cmsx5\" (UID: \"c6c1c2a6-e381-4e17-902c-12dbd344a377\") " pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.568620 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.568600 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rl65g\" (UniqueName: \"kubernetes.io/projected/fdf18fe1-67fc-415f-b637-f1d3a5343441-kube-api-access-rl65g\") pod \"node-resolver-8vvlg\" (UID: \"fdf18fe1-67fc-415f-b637-f1d3a5343441\") " pod="openshift-dns/node-resolver-8vvlg" Apr 22 18:46:30.658982 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.658950 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.659164 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.658999 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-sys-fs\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659164 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659031 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-system-cni-dir\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.659164 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659057 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.659164 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-system-cni-dir\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.659164 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659133 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-sys-fs\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-registration-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659190 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-registration-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659204 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-etc-selinux\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: 
\"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjdrs\" (UniqueName: \"kubernetes.io/projected/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-kube-api-access-bjdrs\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-device-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-socket-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659332 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdcl9\" (UniqueName: \"kubernetes.io/projected/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-kube-api-access-tdcl9\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659360 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-serviceca\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659373 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-device-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659384 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6fad9739-db9a-49b1-aece-1696145ac1fb-konnectivity-ca\") pod \"konnectivity-agent-xwvlb\" (UID: \"6fad9739-db9a-49b1-aece-1696145ac1fb\") " pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:30.659454 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659443 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2kc78\" (UniqueName: \"kubernetes.io/projected/4b04d910-b761-4095-a135-7026105ff82f-kube-api-access-2kc78\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659470 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6fad9739-db9a-49b1-aece-1696145ac1fb-agent-certs\") pod \"konnectivity-agent-xwvlb\" (UID: \"6fad9739-db9a-49b1-aece-1696145ac1fb\") " pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659503 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5q75c\" (UniqueName: \"kubernetes.io/projected/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-kube-api-access-5q75c\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-host\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cnibin\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659581 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660006 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:46:30.659607 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-os-release\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659631 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659648 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-socket-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659653 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659323 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-etc-selinux\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: 
\"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659726 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.659756 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659809 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.659835 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:31.159803791 +0000 UTC m=+3.159377669 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:30.660006 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659899 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cnibin\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.659978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cni-binary-copy\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.660081 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-serviceca\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.660789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.660290 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-os-release\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.660305 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/6fad9739-db9a-49b1-aece-1696145ac1fb-konnectivity-ca\") pod \"konnectivity-agent-xwvlb\" (UID: \"6fad9739-db9a-49b1-aece-1696145ac1fb\") " pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:30.660789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.660340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-host\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.660789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.660365 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.660789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.660442 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.662566 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.662545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/6fad9739-db9a-49b1-aece-1696145ac1fb-agent-certs\") pod \"konnectivity-agent-xwvlb\" (UID: \"6fad9739-db9a-49b1-aece-1696145ac1fb\") " pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:30.669767 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.669744 2579 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:30.669880 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.669772 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:30.669880 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.669789 2579 projected.go:194] Error preparing data for projected volume kube-api-access-k8gp4 for pod openshift-network-diagnostics/network-check-target-jnbvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:30.669880 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:30.669859 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4 podName:d340dfa0-a9e2-48b1-ad81-8921d5782b2e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:31.169840353 +0000 UTC m=+3.169414249 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k8gp4" (UniqueName: "kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4") pod "network-check-target-jnbvq" (UID: "d340dfa0-a9e2-48b1-ad81-8921d5782b2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:30.671493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.671472 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdcl9\" (UniqueName: \"kubernetes.io/projected/b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797-kube-api-access-tdcl9\") pod \"node-ca-sdfvz\" (UID: \"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797\") " pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.671698 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.671674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjdrs\" (UniqueName: \"kubernetes.io/projected/ed7f294c-6d91-4f61-8e43-06d4f9e15e2a-kube-api-access-bjdrs\") pod \"aws-ebs-csi-driver-node-qm2b6\" (UID: \"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.672151 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.672125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q75c\" (UniqueName: \"kubernetes.io/projected/68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b-kube-api-access-5q75c\") pod \"multus-additional-cni-plugins-k6ptp\" (UID: \"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b\") " pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:30.672503 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.672484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2kc78\" (UniqueName: \"kubernetes.io/projected/4b04d910-b761-4095-a135-7026105ff82f-kube-api-access-2kc78\") pod \"network-metrics-daemon-n2rv2\" (UID: 
\"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:30.754270 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.754218 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" Apr 22 18:46:30.760399 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.760375 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8vvlg" Apr 22 18:46:30.768843 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.768821 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 18:46:30.768953 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.768848 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-z6889" Apr 22 18:46:30.774351 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.774326 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-mbb5v" Apr 22 18:46:30.781967 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.781948 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:30.789600 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.789579 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:30.796148 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.796130 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" Apr 22 18:46:30.802716 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.802701 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-sdfvz" Apr 22 18:46:30.809229 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:30.809209 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" Apr 22 18:46:31.152582 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.152553 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded7f294c_6d91_4f61_8e43_06d4f9e15e2a.slice/crio-c9347de036d97cdcbec9f7c54e17eb63121744535961823d1189e437d334891a WatchSource:0}: Error finding container c9347de036d97cdcbec9f7c54e17eb63121744535961823d1189e437d334891a: Status 404 returned error can't find the container with id c9347de036d97cdcbec9f7c54e17eb63121744535961823d1189e437d334891a Apr 22 18:46:31.154227 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.154203 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1af8ee4_e661_4ab0_ac3d_a1dbdbbd3797.slice/crio-6ba3d42f51127e2031007306c615cddbead3c4d325a629196b34a487deaeec75 WatchSource:0}: Error finding container 6ba3d42f51127e2031007306c615cddbead3c4d325a629196b34a487deaeec75: Status 404 returned error can't find the container with id 6ba3d42f51127e2031007306c615cddbead3c4d325a629196b34a487deaeec75 Apr 22 18:46:31.158445 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.158422 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e6f31ad_2c4e_458c_a3d2_d0367bb85bc8.slice/crio-d94d8aa3992ece59813eff2f932b0a70ba9936b04a7e04f6c210d838403fdd90 WatchSource:0}: Error finding container d94d8aa3992ece59813eff2f932b0a70ba9936b04a7e04f6c210d838403fdd90: Status 404 returned error can't find the container with id d94d8aa3992ece59813eff2f932b0a70ba9936b04a7e04f6c210d838403fdd90 Apr 22 18:46:31.159477 ip-10-0-137-19 
kubenswrapper[2579]: W0422 18:46:31.159445 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fad9739_db9a_49b1_aece_1696145ac1fb.slice/crio-0ce496b83764c86dacf4c38e194e63a8527158af19d8fba88579ab2f0a5927f9 WatchSource:0}: Error finding container 0ce496b83764c86dacf4c38e194e63a8527158af19d8fba88579ab2f0a5927f9: Status 404 returned error can't find the container with id 0ce496b83764c86dacf4c38e194e63a8527158af19d8fba88579ab2f0a5927f9 Apr 22 18:46:31.160187 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.160163 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a8ebb3_0e0f_4205_a22a_a1e895bb8f0b.slice/crio-d37ab6d784ffa5345dea98d9821eabfc424dfd8d2babd2311fe10c3c0525c37f WatchSource:0}: Error finding container d37ab6d784ffa5345dea98d9821eabfc424dfd8d2babd2311fe10c3c0525c37f: Status 404 returned error can't find the container with id d37ab6d784ffa5345dea98d9821eabfc424dfd8d2babd2311fe10c3c0525c37f Apr 22 18:46:31.161602 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.161444 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfdf18fe1_67fc_415f_b637_f1d3a5343441.slice/crio-e362d397181636fd504c57913d8898cfa7c267f12cd156029210bd6e5ef6c50c WatchSource:0}: Error finding container e362d397181636fd504c57913d8898cfa7c267f12cd156029210bd6e5ef6c50c: Status 404 returned error can't find the container with id e362d397181636fd504c57913d8898cfa7c267f12cd156029210bd6e5ef6c50c Apr 22 18:46:31.161911 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.161791 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ebbe8a2_e463_407a_a400_add3d4b5438a.slice/crio-41b4d2ca6c08a9c7b1ecfa9498a367298d1e5e0138f878df7a19c03667183ae7 WatchSource:0}: Error finding container 
41b4d2ca6c08a9c7b1ecfa9498a367298d1e5e0138f878df7a19c03667183ae7: Status 404 returned error can't find the container with id 41b4d2ca6c08a9c7b1ecfa9498a367298d1e5e0138f878df7a19c03667183ae7 Apr 22 18:46:31.162453 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.162436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:31.162590 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:31.162563 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:31.162648 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:31.162621 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:32.162602753 +0000 UTC m=+4.162176643 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:31.163023 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.162912 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6c1c2a6_e381_4e17_902c_12dbd344a377.slice/crio-6596cb9f65f849e3adcd09dd84aa807e02f981ce88639fb104a942ef7fc74d23 WatchSource:0}: Error finding container 6596cb9f65f849e3adcd09dd84aa807e02f981ce88639fb104a942ef7fc74d23: Status 404 returned error can't find the container with id 6596cb9f65f849e3adcd09dd84aa807e02f981ce88639fb104a942ef7fc74d23 Apr 22 18:46:31.164192 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:46:31.164164 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podadf86e5d_38b3_4766_b4e8_e9f7a2380707.slice/crio-7205bce064521b559d68b31f5423844ee5838938c758155fc2337062f731cfcb WatchSource:0}: Error finding container 7205bce064521b559d68b31f5423844ee5838938c758155fc2337062f731cfcb: Status 404 returned error can't find the container with id 7205bce064521b559d68b31f5423844ee5838938c758155fc2337062f731cfcb Apr 22 18:46:31.262733 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.262703 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:31.262880 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:31.262861 2579 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:31.262922 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:31.262886 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:31.262922 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:31.262898 2579 projected.go:194] Error preparing data for projected volume kube-api-access-k8gp4 for pod openshift-network-diagnostics/network-check-target-jnbvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:31.262993 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:31.262946 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4 podName:d340dfa0-a9e2-48b1-ad81-8921d5782b2e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:32.262930917 +0000 UTC m=+4.262504805 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-k8gp4" (UniqueName: "kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4") pod "network-check-target-jnbvq" (UID: "d340dfa0-a9e2-48b1-ad81-8921d5782b2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:31.484616 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.484421 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 18:41:29 +0000 UTC" deadline="2027-10-01 00:10:11.215315296 +0000 UTC"
Apr 22 18:46:31.484616 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.484610 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12629h23m39.730708412s"
Apr 22 18:46:31.549724 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.549184 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" event={"ID":"75deb02a1840e8f4bbd2a0c4ef3a9ce4","Type":"ContainerStarted","Data":"dfe10d8a62f44a95011dbbdfe0617ef2ce72979774a60642283efa519ae194b3"}
Apr 22 18:46:31.554586 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.554504 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"7205bce064521b559d68b31f5423844ee5838938c758155fc2337062f731cfcb"}
Apr 22 18:46:31.558702 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.558638 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" event={"ID":"c6c1c2a6-e381-4e17-902c-12dbd344a377","Type":"ContainerStarted","Data":"6596cb9f65f849e3adcd09dd84aa807e02f981ce88639fb104a942ef7fc74d23"}
Apr 22 18:46:31.560868 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.560806 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6889" event={"ID":"6ebbe8a2-e463-407a-a400-add3d4b5438a","Type":"ContainerStarted","Data":"41b4d2ca6c08a9c7b1ecfa9498a367298d1e5e0138f878df7a19c03667183ae7"}
Apr 22 18:46:31.565666 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.565578 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerStarted","Data":"d37ab6d784ffa5345dea98d9821eabfc424dfd8d2babd2311fe10c3c0525c37f"}
Apr 22 18:46:31.574388 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.574328 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xwvlb" event={"ID":"6fad9739-db9a-49b1-aece-1696145ac1fb","Type":"ContainerStarted","Data":"0ce496b83764c86dacf4c38e194e63a8527158af19d8fba88579ab2f0a5927f9"}
Apr 22 18:46:31.579137 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.578294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mbb5v" event={"ID":"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8","Type":"ContainerStarted","Data":"d94d8aa3992ece59813eff2f932b0a70ba9936b04a7e04f6c210d838403fdd90"}
Apr 22 18:46:31.590351 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.590302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8vvlg" event={"ID":"fdf18fe1-67fc-415f-b637-f1d3a5343441","Type":"ContainerStarted","Data":"e362d397181636fd504c57913d8898cfa7c267f12cd156029210bd6e5ef6c50c"}
Apr 22 18:46:31.595378 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.595319 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdfvz" event={"ID":"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797","Type":"ContainerStarted","Data":"6ba3d42f51127e2031007306c615cddbead3c4d325a629196b34a487deaeec75"}
Apr 22 18:46:31.597151 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:31.597102 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" event={"ID":"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a","Type":"ContainerStarted","Data":"c9347de036d97cdcbec9f7c54e17eb63121744535961823d1189e437d334891a"}
Apr 22 18:46:32.171532 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:32.171492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:32.171730 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.171677 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:32.171796 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.171742 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:34.171723559 +0000 UTC m=+6.171297450 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:32.273094 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:32.272649 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:32.273094 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.272835 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:32.273094 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.272854 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:32.273094 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.272867 2579 projected.go:194] Error preparing data for projected volume kube-api-access-k8gp4 for pod openshift-network-diagnostics/network-check-target-jnbvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:32.273094 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.272923 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4 podName:d340dfa0-a9e2-48b1-ad81-8921d5782b2e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:34.272904852 +0000 UTC m=+6.272478732 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8gp4" (UniqueName: "kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4") pod "network-check-target-jnbvq" (UID: "d340dfa0-a9e2-48b1-ad81-8921d5782b2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:32.540979 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:32.540420 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:32.540979 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.540545 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:32.541876 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:32.541601 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:32.541876 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:32.541715 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:32.628102 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:32.628045 2579 generic.go:358] "Generic (PLEG): container finished" podID="a98f83e5fe1a17abcd4731f69cb0d908" containerID="10721fb7ecba3306fa2ecd13571a0e92616017f6a24e48ccf5d44768d12cfa07" exitCode=0
Apr 22 18:46:32.628289 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:32.628131 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" event={"ID":"a98f83e5fe1a17abcd4731f69cb0d908","Type":"ContainerDied","Data":"10721fb7ecba3306fa2ecd13571a0e92616017f6a24e48ccf5d44768d12cfa07"}
Apr 22 18:46:32.645096 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:32.644967 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-19.ec2.internal" podStartSLOduration=2.644949446 podStartE2EDuration="2.644949446s" podCreationTimestamp="2026-04-22 18:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:31.564848704 +0000 UTC m=+3.564422601" watchObservedRunningTime="2026-04-22 18:46:32.644949446 +0000 UTC m=+4.644523347"
Apr 22 18:46:33.637434 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:33.637394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" event={"ID":"a98f83e5fe1a17abcd4731f69cb0d908","Type":"ContainerStarted","Data":"0565f367d0ad50ac17d2eb52df834a2dd693e81cee10225ad397a4e41db7c81d"}
Apr 22 18:46:34.187630 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:34.187590 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:34.187814 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.187728 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:34.187814 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.187792 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.18777634 +0000 UTC m=+10.187350215 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:34.288771 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:34.288732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:34.288950 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.288895 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:34.288950 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.288915 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:34.288950 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.288928 2579 projected.go:194] Error preparing data for projected volume kube-api-access-k8gp4 for pod openshift-network-diagnostics/network-check-target-jnbvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:34.289111 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.288987 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4 podName:d340dfa0-a9e2-48b1-ad81-8921d5782b2e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.288968441 +0000 UTC m=+10.288542317 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8gp4" (UniqueName: "kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4") pod "network-check-target-jnbvq" (UID: "d340dfa0-a9e2-48b1-ad81-8921d5782b2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:34.539853 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:34.539556 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:34.539853 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.539681 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:34.540066 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:34.539901 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:34.540111 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:34.540073 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:36.539348 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:36.539303 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:36.539786 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:36.539378 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:36.539786 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:36.539498 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:36.539786 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:36.539686 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:37.012536 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.012476 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-19.ec2.internal" podStartSLOduration=7.012455918 podStartE2EDuration="7.012455918s" podCreationTimestamp="2026-04-22 18:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:46:33.654333438 +0000 UTC m=+5.653907335" watchObservedRunningTime="2026-04-22 18:46:37.012455918 +0000 UTC m=+9.012029817"
Apr 22 18:46:37.012952 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.012929 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hr4mw"]
Apr 22 18:46:37.016018 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.015994 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.016126 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:37.016068 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:37.112550 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.112515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-dbus\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.112728 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.112565 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-kubelet-config\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.112728 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.112592 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.212920 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.212891 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-dbus\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.213105 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.212937 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-kubelet-config\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.213105 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.212964 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.213208 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:37.213121 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:37.213208 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:37.213170 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret podName:1e505d90-2a43-4f9a-a513-c9f1e8c46ac4 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:37.713153924 +0000 UTC m=+9.712727813 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret") pod "global-pull-secret-syncer-hr4mw" (UID: "1e505d90-2a43-4f9a-a513-c9f1e8c46ac4") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:37.213208 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.213118 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-kubelet-config\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.213385 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.213216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-dbus\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.717957 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:37.717654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:37.717957 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:37.717854 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:37.717957 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:37.717912 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret podName:1e505d90-2a43-4f9a-a513-c9f1e8c46ac4 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:38.717892814 +0000 UTC m=+10.717466704 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret") pod "global-pull-secret-syncer-hr4mw" (UID: "1e505d90-2a43-4f9a-a513-c9f1e8c46ac4") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:38.222466 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:38.222427 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:38.222653 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.222592 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:38.222846 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.222662 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.222642788 +0000 UTC m=+18.222216680 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:38.323335 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:38.323295 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:38.323519 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.323463 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 18:46:38.323519 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.323482 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 18:46:38.323519 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.323494 2579 projected.go:194] Error preparing data for projected volume kube-api-access-k8gp4 for pod openshift-network-diagnostics/network-check-target-jnbvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:38.323720 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.323551 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4 podName:d340dfa0-a9e2-48b1-ad81-8921d5782b2e nodeName:}" failed. No retries permitted until 2026-04-22 18:46:46.323532819 +0000 UTC m=+18.323106709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8gp4" (UniqueName: "kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4") pod "network-check-target-jnbvq" (UID: "d340dfa0-a9e2-48b1-ad81-8921d5782b2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 18:46:38.539233 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:38.539147 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:38.539417 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.539326 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:38.539417 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:38.539340 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:38.540583 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.540393 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:38.540583 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:38.540449 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:38.540583 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.540518 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:38.727081 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:38.726982 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:38.727670 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.727147 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:38.727670 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:38.727212 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret podName:1e505d90-2a43-4f9a-a513-c9f1e8c46ac4 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:40.727195142 +0000 UTC m=+12.726769033 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret") pod "global-pull-secret-syncer-hr4mw" (UID: "1e505d90-2a43-4f9a-a513-c9f1e8c46ac4") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:40.542348 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:40.542318 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:40.542808 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:40.542318 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:40.542808 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:40.542439 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:40.542808 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:40.542325 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:40.542808 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:40.542487 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:40.542808 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:40.542591 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:40.740612 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:40.740575 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:40.740799 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:40.740740 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:40.740859 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:40.740821 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret podName:1e505d90-2a43-4f9a-a513-c9f1e8c46ac4 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:44.740799736 +0000 UTC m=+16.740373634 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret") pod "global-pull-secret-syncer-hr4mw" (UID: "1e505d90-2a43-4f9a-a513-c9f1e8c46ac4") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:42.539673 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:42.539633 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:42.540136 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:42.539777 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:42.540136 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:42.539633 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:42.540136 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:42.539868 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:42.540136 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:42.539633 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:42.540136 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:42.539931 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:44.539364 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:44.539331 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:44.539813 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:44.539456 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:44.539813 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:44.539332 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:44.539813 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:44.539327 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:44.539813 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:44.539540 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:44.539813 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:44.539727 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:44.770657 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:44.770610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:44.770809 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:44.770720 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:44.770876 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:44.770832 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret podName:1e505d90-2a43-4f9a-a513-c9f1e8c46ac4 nodeName:}" failed. No retries permitted until 2026-04-22 18:46:52.770811726 +0000 UTC m=+24.770385601 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret") pod "global-pull-secret-syncer-hr4mw" (UID: "1e505d90-2a43-4f9a-a513-c9f1e8c46ac4") : object "kube-system"/"original-pull-secret" not registered
Apr 22 18:46:46.280690 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:46.280648 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:46.281136 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.280799 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 18:46:46.281136 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.280866 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.280846831 +0000 UTC m=+34.280420705 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 18:46:46.381988 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:46.381951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:46.382160 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.382134 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 18:46:46.382160 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.382156 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 18:46:46.382232 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.382165 2579 projected.go:194] Error preparing data for projected volume kube-api-access-k8gp4 for pod openshift-network-diagnostics/network-check-target-jnbvq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:46.382232 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.382219 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4 podName:d340dfa0-a9e2-48b1-ad81-8921d5782b2e nodeName:}" failed. 
No retries permitted until 2026-04-22 18:47:02.382201959 +0000 UTC m=+34.381775834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-k8gp4" (UniqueName: "kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4") pod "network-check-target-jnbvq" (UID: "d340dfa0-a9e2-48b1-ad81-8921d5782b2e") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 18:46:46.538885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:46.538808 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:46:46.538885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:46.538808 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:46.539123 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.538936 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4" Apr 22 18:46:46.539123 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:46.538812 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:46.539123 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.538989 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e" Apr 22 18:46:46.539123 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:46.539082 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f" Apr 22 18:46:48.539575 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.539380 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:48.540135 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:48.539888 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f" Apr 22 18:46:48.540135 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.539478 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:46:48.540135 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:48.539976 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4" Apr 22 18:46:48.540135 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.539458 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:48.540135 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:48.540062 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e" Apr 22 18:46:48.664691 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.664650 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xwvlb" event={"ID":"6fad9739-db9a-49b1-aece-1696145ac1fb","Type":"ContainerStarted","Data":"d5cdf0908b653057b746755c7f90d1dc04962621a77cb6b2d812b9465fe82034"} Apr 22 18:46:48.665976 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.665948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8vvlg" event={"ID":"fdf18fe1-67fc-415f-b637-f1d3a5343441","Type":"ContainerStarted","Data":"0898efcb45dd30cbad66cc9e366724041fdd03e3e41251b4bbe0588926acbf4f"} Apr 22 18:46:48.667292 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.667254 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-sdfvz" event={"ID":"b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797","Type":"ContainerStarted","Data":"81956be6b4732dc750ed3db08ec92ddc6612722a78c38f0159862b3d4c4ab911"} Apr 22 18:46:48.668627 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.668569 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" 
event={"ID":"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a","Type":"ContainerStarted","Data":"a99f424375e4fba51c0df42fabc43b2ea8e90aeb58a265ec3cd1e87f996a752b"} Apr 22 18:46:48.670092 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.670055 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"40dd4f1d13ea4d02535cec51134d6ae861dcc110ade804b29b894f257f72f366"} Apr 22 18:46:48.671569 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.671542 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" event={"ID":"c6c1c2a6-e381-4e17-902c-12dbd344a377","Type":"ContainerStarted","Data":"b19094c95768a4b151ae22a723536e6746c642454ddd2696c67fb3554cbeebba"} Apr 22 18:46:48.672993 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.672910 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-z6889" event={"ID":"6ebbe8a2-e463-407a-a400-add3d4b5438a","Type":"ContainerStarted","Data":"3bfeaf0b42b79486427e4e6ef5e102df0f263118e45334246de95da5dd088c9f"} Apr 22 18:46:48.674345 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.674323 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerStarted","Data":"adc8e496c5590a095541d688b0f41459cd42d5dc24b05ce76bd68b27c54fc3ea"} Apr 22 18:46:48.679647 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.679601 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xwvlb" podStartSLOduration=3.565474529 podStartE2EDuration="20.679565457s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.162967702 +0000 UTC m=+3.162541586" lastFinishedPulling="2026-04-22 18:46:48.277058634 +0000 UTC m=+20.276632514" 
observedRunningTime="2026-04-22 18:46:48.679524322 +0000 UTC m=+20.679098216" watchObservedRunningTime="2026-04-22 18:46:48.679565457 +0000 UTC m=+20.679139335" Apr 22 18:46:48.692307 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.692236 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-sdfvz" podStartSLOduration=11.587285353 podStartE2EDuration="20.692216586s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.156517717 +0000 UTC m=+3.156091600" lastFinishedPulling="2026-04-22 18:46:40.261448958 +0000 UTC m=+12.261022833" observedRunningTime="2026-04-22 18:46:48.691629584 +0000 UTC m=+20.691203482" watchObservedRunningTime="2026-04-22 18:46:48.692216586 +0000 UTC m=+20.691790484" Apr 22 18:46:48.704729 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.704672 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8vvlg" podStartSLOduration=3.591883094 podStartE2EDuration="20.704652439s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.164417351 +0000 UTC m=+3.163991231" lastFinishedPulling="2026-04-22 18:46:48.277186688 +0000 UTC m=+20.276760576" observedRunningTime="2026-04-22 18:46:48.703847839 +0000 UTC m=+20.703421738" watchObservedRunningTime="2026-04-22 18:46:48.704652439 +0000 UTC m=+20.704226337" Apr 22 18:46:48.730649 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.730603 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-z6889" podStartSLOduration=3.552858529 podStartE2EDuration="20.73058837s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.164992495 +0000 UTC m=+3.164566375" lastFinishedPulling="2026-04-22 18:46:48.342722334 +0000 UTC m=+20.342296216" observedRunningTime="2026-04-22 18:46:48.730012175 +0000 UTC m=+20.729586071" 
watchObservedRunningTime="2026-04-22 18:46:48.73058837 +0000 UTC m=+20.730162266" Apr 22 18:46:48.760902 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:48.760853 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-cmsx5" podStartSLOduration=3.6453966859999998 podStartE2EDuration="20.760839137s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.165432892 +0000 UTC m=+3.165006769" lastFinishedPulling="2026-04-22 18:46:48.280875332 +0000 UTC m=+20.280449220" observedRunningTime="2026-04-22 18:46:48.760282223 +0000 UTC m=+20.759856111" watchObservedRunningTime="2026-04-22 18:46:48.760839137 +0000 UTC m=+20.760413064" Apr 22 18:46:49.424712 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.424690 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 18:46:49.532286 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.532179 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T18:46:49.424707947Z","UUID":"b5102ce9-5aaf-4a9c-b3ec-0a6336608c85","Handler":null,"Name":"","Endpoint":""} Apr 22 18:46:49.534497 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.534470 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 18:46:49.534603 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.534504 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 18:46:49.678772 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.678690 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"6a459181f5be0529881a8d820d613e659a4d03a5b57816ff19c1a0e7c301c9be"} Apr 22 18:46:49.678772 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.678729 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"6f5c5e2178da9031701f289a7ffc8c25fa9bdd0ceb22f25f53fe5c61cfba8e3c"} Apr 22 18:46:49.678772 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.678741 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"f1a146d32c581ed16f0b69af254df797c127046ee85f43dadca26af5943464dc"} Apr 22 18:46:49.678772 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.678750 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"0a8df43c7d589ecf4bebee33268ce59afb57f08efd5e060054ac2bd3cdcca76f"} Apr 22 18:46:49.678772 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.678760 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"3ff0969b9c4835acf069660b3986805a8e83e120b63e8ea6fa16d4715e853467"} Apr 22 18:46:49.679921 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.679894 2579 generic.go:358] "Generic (PLEG): container finished" podID="68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b" containerID="adc8e496c5590a095541d688b0f41459cd42d5dc24b05ce76bd68b27c54fc3ea" exitCode=0 Apr 22 18:46:49.680037 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.679926 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerDied","Data":"adc8e496c5590a095541d688b0f41459cd42d5dc24b05ce76bd68b27c54fc3ea"} Apr 22 18:46:49.681340 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.681217 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-mbb5v" event={"ID":"6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8","Type":"ContainerStarted","Data":"c0b473848e0f6d1ae1c3f5596f080db0045787618a61d0c77080a6543862ae67"} Apr 22 18:46:49.682685 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.682663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" event={"ID":"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a","Type":"ContainerStarted","Data":"4b030aaef0b42f32e105c435c6e82498373f2f356387b46aa44fa58e87f0f019"} Apr 22 18:46:49.711884 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:49.711845 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-mbb5v" podStartSLOduration=4.595434718 podStartE2EDuration="21.711834508s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.161116461 +0000 UTC m=+3.160690340" lastFinishedPulling="2026-04-22 18:46:48.277516252 +0000 UTC m=+20.277090130" observedRunningTime="2026-04-22 18:46:49.711532802 +0000 UTC m=+21.711106699" watchObservedRunningTime="2026-04-22 18:46:49.711834508 +0000 UTC m=+21.711408404" Apr 22 18:46:50.539541 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:50.539507 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:50.539716 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:50.539507 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:46:50.539716 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:50.539636 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f" Apr 22 18:46:50.539819 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:50.539507 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:50.539819 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:50.539719 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4" Apr 22 18:46:50.539819 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:50.539799 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e" Apr 22 18:46:50.686989 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:50.686947 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" event={"ID":"ed7f294c-6d91-4f61-8e43-06d4f9e15e2a","Type":"ContainerStarted","Data":"e9fe705c3c0cd3f46793f429c01462ea71f99487ed551a0fc079a99c648f5473"} Apr 22 18:46:50.718867 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:50.718816 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-qm2b6" podStartSLOduration=3.505059111 podStartE2EDuration="22.718797516s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.154331547 +0000 UTC m=+3.153905424" lastFinishedPulling="2026-04-22 18:46:50.368069946 +0000 UTC m=+22.367643829" observedRunningTime="2026-04-22 18:46:50.717089729 +0000 UTC m=+22.716663629" watchObservedRunningTime="2026-04-22 18:46:50.718797516 +0000 UTC m=+22.718371413" Apr 22 18:46:51.009376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:51.009291 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:51.010160 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:51.010140 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:51.692323 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:51.692278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"3e56d229f222ab03106b188e31beb33dbed557e7299591f829d31437d498786c"} Apr 22 18:46:51.692788 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:51.692651 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:51.693065 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:51.693049 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xwvlb" Apr 22 18:46:52.541629 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:52.541596 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:46:52.541629 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:52.541632 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:52.541866 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:52.541638 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:52.541866 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:52.541699 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4" Apr 22 18:46:52.541866 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:52.541775 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e" Apr 22 18:46:52.541866 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:52.541854 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f" Apr 22 18:46:52.836492 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:52.836410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:46:52.836982 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:52.836537 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:52.836982 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:52.836603 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret podName:1e505d90-2a43-4f9a-a513-c9f1e8c46ac4 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:08.836584654 +0000 UTC m=+40.836158534 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret") pod "global-pull-secret-syncer-hr4mw" (UID: "1e505d90-2a43-4f9a-a513-c9f1e8c46ac4") : object "kube-system"/"original-pull-secret" not registered Apr 22 18:46:53.701923 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:53.701647 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" event={"ID":"adf86e5d-38b3-4766-b4e8-e9f7a2380707","Type":"ContainerStarted","Data":"57a51ca859355e806e65e5837182c5f7e4a705bb2fdab42dac2fd86189943c79"} Apr 22 18:46:53.702365 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:53.702176 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:53.702365 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:53.702208 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:53.716927 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:53.716805 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" Apr 22 18:46:53.734679 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:53.734182 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb" podStartSLOduration=8.338074586 podStartE2EDuration="25.734167238s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.167698774 +0000 UTC m=+3.167272656" lastFinishedPulling="2026-04-22 18:46:48.563791432 +0000 UTC m=+20.563365308" observedRunningTime="2026-04-22 18:46:53.731968174 +0000 UTC m=+25.731542095" watchObservedRunningTime="2026-04-22 18:46:53.734167238 +0000 UTC m=+25.733741134" Apr 22 18:46:54.541453 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:54.541425 2579 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:46:54.541453 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:54.541438 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:46:54.541453 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:54.541425 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:46:54.541902 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:54.541529 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4" Apr 22 18:46:54.541902 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:54.541569 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e" Apr 22 18:46:54.541902 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:54.541639 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:54.705369 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:54.705331 2579 generic.go:358] "Generic (PLEG): container finished" podID="68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b" containerID="c02fa4e61aa0b5c0cc7e4bc716a8b0dae49016a86e5b19f019ff17449ac85663" exitCode=0
Apr 22 18:46:54.705524 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:54.705416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerDied","Data":"c02fa4e61aa0b5c0cc7e4bc716a8b0dae49016a86e5b19f019ff17449ac85663"}
Apr 22 18:46:54.709318 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:54.708048 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:54.723168 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:54.723138 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:46:55.709492 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.709316 2579 generic.go:358] "Generic (PLEG): container finished" podID="68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b" containerID="3061e07e54f33d8a80d4df0d70bfb70bf2c4498f5f549dd0688588e8262fbb86" exitCode=0
Apr 22 18:46:55.709905 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.709396 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerDied","Data":"3061e07e54f33d8a80d4df0d70bfb70bf2c4498f5f549dd0688588e8262fbb86"}
Apr 22 18:46:55.849746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.849676 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hr4mw"]
Apr 22 18:46:55.849892 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.849808 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:55.849949 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:55.849917 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:55.852377 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.852356 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n2rv2"]
Apr 22 18:46:55.852499 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.852445 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:55.852549 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:55.852509 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:55.855340 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.855320 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jnbvq"]
Apr 22 18:46:55.855412 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:55.855396 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:55.855491 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:55.855473 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:56.715257 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:56.715227 2579 generic.go:358] "Generic (PLEG): container finished" podID="68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b" containerID="ae09308b4f07ecc82adb00fc921f6bdf5eb21f8e2a8c52814190cf86a4e9cc5b" exitCode=0
Apr 22 18:46:56.715604 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:56.715300 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerDied","Data":"ae09308b4f07ecc82adb00fc921f6bdf5eb21f8e2a8c52814190cf86a4e9cc5b"}
Apr 22 18:46:57.539104 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:57.539019 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:57.539104 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:57.539068 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:57.539367 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:57.539129 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:57.539367 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:57.539240 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:57.539367 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:57.539307 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:46:57.539502 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:57.539365 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:59.538943 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:59.538911 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:46:59.539637 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:59.538953 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:46:59.539637 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:46:59.538918 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:46:59.539637 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:59.539071 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:46:59.539637 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:59.539122 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-hr4mw" podUID="1e505d90-2a43-4f9a-a513-c9f1e8c46ac4"
Apr 22 18:46:59.539637 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:46:59.539195 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jnbvq" podUID="d340dfa0-a9e2-48b1-ad81-8921d5782b2e"
Apr 22 18:47:01.280005 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.279923 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-19.ec2.internal" event="NodeReady"
Apr 22 18:47:01.280579 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.280077 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 18:47:01.317847 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.317812 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d7787fd58-v5n8p"]
Apr 22 18:47:01.354969 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.354937 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"]
Apr 22 18:47:01.355155 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.355120 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.358071 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.357919 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 18:47:01.358071 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.357945 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 18:47:01.358071 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.357952 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vkfzv\""
Apr 22 18:47:01.358071 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.358038 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 18:47:01.370096 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.370072 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"]
Apr 22 18:47:01.370277 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.370246 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:47:01.373288 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.373249 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 22 18:47:01.374510 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.374489 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 22 18:47:01.374621 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.374532 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-m5qgq\""
Apr 22 18:47:01.379790 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.379766 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 18:47:01.394744 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.394717 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"]
Apr 22 18:47:01.394898 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.394867 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:01.397498 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.397475 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 18:47:01.397498 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.397489 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 18:47:01.397677 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.397477 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 18:47:01.397677 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.397480 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 18:47:01.410398 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.410379 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj"]
Apr 22 18:47:01.410556 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.410538 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.413444 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.413421 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 18:47:01.413444 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.413438 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 18:47:01.413601 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.413431 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 18:47:01.413601 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.413509 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 18:47:01.431136 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.431114 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d7787fd58-v5n8p"]
Apr 22 18:47:01.431291 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.431143 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"]
Apr 22 18:47:01.431291 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.431156 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"]
Apr 22 18:47:01.431291 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.431168 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj"]
Apr 22 18:47:01.431291 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.431180 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"]
Apr 22 18:47:01.431291 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.431194 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9wfwv"]
Apr 22 18:47:01.431675 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.431650 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj"
Apr 22 18:47:01.434294 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.434245 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-kbfzk\""
Apr 22 18:47:01.434408 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.434301 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 18:47:01.449146 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.449105 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-22c27"]
Apr 22 18:47:01.449311 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.449279 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9wfwv"
Apr 22 18:47:01.452010 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.451980 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 18:47:01.452010 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.452001 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q94s6\""
Apr 22 18:47:01.452198 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.452036 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 18:47:01.452198 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.451990 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 18:47:01.462332 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.462305 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9wfwv"]
Apr 22 18:47:01.462332 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.462333 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-22c27"]
Apr 22 18:47:01.462508 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.462450 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-22c27"
Apr 22 18:47:01.465351 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.465326 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 18:47:01.465491 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.465355 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w2ls6\""
Apr 22 18:47:01.465564 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.465496 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 18:47:01.510556 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th7g9\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-kube-api-access-th7g9\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.510746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510574 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.510746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510605 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:47:01.510746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510632 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfljd\" (UniqueName: \"kubernetes.io/projected/3a76dcd7-19d7-40d1-8cd5-dc766c360423-kube-api-access-gfljd\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:01.510746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510660 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.510746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a76dcd7-19d7-40d1-8cd5-dc766c360423-tmp\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:01.510746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510703 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-trusted-ca\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.510746 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-ca\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510771 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/477c8ebb-278f-4a30-9476-d0758c0fce10-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3a76dcd7-19d7-40d1-8cd5-dc766c360423-klusterlet-config\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510867 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wddtx\" (UniqueName: \"kubernetes.io/projected/463cb4ff-3a7f-462a-8187-d77573dd3e54-kube-api-access-wddtx\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510892 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-certificates\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510943 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de03fc44-c0d2-4f77-9197-77718a6f0aef-ca-trust-extracted\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510973 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-hub\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.510996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-image-registry-private-configuration\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.511088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.511038 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/463cb4ff-3a7f-462a-8187-d77573dd3e54-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.511452 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.511102 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-bound-sa-token\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.511452 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.511160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-installation-pull-secrets\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.511452 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.511189 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.538841 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.538762 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:47:01.538995 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.538868 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:47:01.539119 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.539079 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw"
Apr 22 18:47:01.542049 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.542021 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 18:47:01.542164 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.542065 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-c77rp\""
Apr 22 18:47:01.542244 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.542228 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 18:47:01.542349 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.542329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 18:47:01.542404 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.542357 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qpsz\""
Apr 22 18:47:01.542404 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.542364 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 18:47:01.611792 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611762 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-certificates\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.611977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611806 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de03fc44-c0d2-4f77-9197-77718a6f0aef-ca-trust-extracted\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.611977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/463cb4ff-3a7f-462a-8187-d77573dd3e54-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.611977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-bound-sa-token\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.611977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-installation-pull-secrets\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.611977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a76dcd7-19d7-40d1-8cd5-dc766c360423-tmp\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:01.611977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611937 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27"
Apr 22 18:47:01.611977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.611962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:47:01.612344 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.612052 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:47:01.612344 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.612139 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.11211858 +0000 UTC m=+34.111692458 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found
Apr 22 18:47:01.612508 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612485 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de03fc44-c0d2-4f77-9197-77718a6f0aef-ca-trust-extracted\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.612574 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612509 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwpdf\" (UniqueName: \"kubernetes.io/projected/efb38099-2266-40a5-ba8f-a7759b82543b-kube-api-access-pwpdf\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27"
Apr 22 18:47:01.612574 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th7g9\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-kube-api-access-th7g9\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.612676 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"
Apr 22 18:47:01.612676 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612628 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.612676 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612650 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-certificates\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:01.612676 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efb38099-2266-40a5-ba8f-a7759b82543b-config-volume\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27"
Apr 22 18:47:01.612885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612693 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3a76dcd7-19d7-40d1-8cd5-dc766c360423-tmp\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:01.612885 ip-10-0-137-19
kubenswrapper[2579]: I0422 18:47:01.612751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/463cb4ff-3a7f-462a-8187-d77573dd3e54-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.612885 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.612788 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:01.612885 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.612802 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found Apr 22 18:47:01.612885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612796 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-trusted-ca\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:01.612885 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.612861 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.1128407 +0000 UTC m=+34.112414580 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found Apr 22 18:47:01.612885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612851 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/50b1a4fa-9a2f-4da4-a1e6-29794d728c75-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b9db494bf-49pcj\" (UID: \"50b1a4fa-9a2f-4da4-a1e6-29794d728c75\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" Apr 22 18:47:01.613200 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612905 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-ca\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.613200 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612940 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/477c8ebb-278f-4a30-9476-d0758c0fce10-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:47:01.613200 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.612980 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wddtx\" (UniqueName: \"kubernetes.io/projected/463cb4ff-3a7f-462a-8187-d77573dd3e54-kube-api-access-wddtx\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: 
\"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.613200 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613037 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbhs8\" (UniqueName: \"kubernetes.io/projected/85daa685-3b8e-4641-b717-08df86db79f9-kube-api-access-wbhs8\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:01.613200 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613069 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-hub\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.613200 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613105 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-image-registry-private-configuration\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:01.613200 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/efb38099-2266-40a5-ba8f-a7759b82543b-tmp-dir\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.613540 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613214 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.613540 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfljd\" (UniqueName: \"kubernetes.io/projected/3a76dcd7-19d7-40d1-8cd5-dc766c360423-kube-api-access-gfljd\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" Apr 22 18:47:01.613540 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613327 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfjr5\" (UniqueName: \"kubernetes.io/projected/50b1a4fa-9a2f-4da4-a1e6-29794d728c75-kube-api-access-vfjr5\") pod \"managed-serviceaccount-addon-agent-7b9db494bf-49pcj\" (UID: \"50b1a4fa-9a2f-4da4-a1e6-29794d728c75\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" Apr 22 18:47:01.613540 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613388 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3a76dcd7-19d7-40d1-8cd5-dc766c360423-klusterlet-config\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" Apr 22 18:47:01.613732 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/477c8ebb-278f-4a30-9476-d0758c0fce10-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:47:01.613732 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.613721 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:01.614373 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.614295 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-trusted-ca\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:01.616978 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.616932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-image-registry-private-configuration\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:01.616978 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.616975 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.617144 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.617040 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.617144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.617048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-installation-pull-secrets\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:01.617420 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.616989 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-ca\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.617657 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.617632 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3a76dcd7-19d7-40d1-8cd5-dc766c360423-klusterlet-config\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" Apr 22 18:47:01.617776 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.617761 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: 
\"kubernetes.io/secret/463cb4ff-3a7f-462a-8187-d77573dd3e54-hub\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.621290 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.621223 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-bound-sa-token\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:01.622716 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.622694 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfljd\" (UniqueName: \"kubernetes.io/projected/3a76dcd7-19d7-40d1-8cd5-dc766c360423-kube-api-access-gfljd\") pod \"klusterlet-addon-workmgr-697466dcfb-6mng2\" (UID: \"3a76dcd7-19d7-40d1-8cd5-dc766c360423\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" Apr 22 18:47:01.623295 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.623255 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wddtx\" (UniqueName: \"kubernetes.io/projected/463cb4ff-3a7f-462a-8187-d77573dd3e54-kube-api-access-wddtx\") pod \"cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl\" (UID: \"463cb4ff-3a7f-462a-8187-d77573dd3e54\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.623505 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.623486 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th7g9\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-kube-api-access-th7g9\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " 
pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:01.705699 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.705662 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" Apr 22 18:47:01.714921 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.714713 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfjr5\" (UniqueName: \"kubernetes.io/projected/50b1a4fa-9a2f-4da4-a1e6-29794d728c75-kube-api-access-vfjr5\") pod \"managed-serviceaccount-addon-agent-7b9db494bf-49pcj\" (UID: \"50b1a4fa-9a2f-4da4-a1e6-29794d728c75\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" Apr 22 18:47:01.715063 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.714960 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:01.715063 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715039 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.715188 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.715128 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:01.715188 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.715139 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:01.715188 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715170 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwpdf\" (UniqueName: \"kubernetes.io/projected/efb38099-2266-40a5-ba8f-a7759b82543b-kube-api-access-pwpdf\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.715357 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.715193 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.21517239 +0000 UTC m=+34.214746285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found Apr 22 18:47:01.715357 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:01.715278 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:02.215244384 +0000 UTC m=+34.214818260 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found Apr 22 18:47:01.715357 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efb38099-2266-40a5-ba8f-a7759b82543b-config-volume\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.715484 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715370 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/50b1a4fa-9a2f-4da4-a1e6-29794d728c75-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b9db494bf-49pcj\" (UID: \"50b1a4fa-9a2f-4da4-a1e6-29794d728c75\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" Apr 22 18:47:01.715484 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715444 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbhs8\" (UniqueName: \"kubernetes.io/projected/85daa685-3b8e-4641-b717-08df86db79f9-kube-api-access-wbhs8\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:01.715561 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/efb38099-2266-40a5-ba8f-a7759b82543b-tmp-dir\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.715853 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715812 
2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/efb38099-2266-40a5-ba8f-a7759b82543b-tmp-dir\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.715959 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.715903 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/efb38099-2266-40a5-ba8f-a7759b82543b-config-volume\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.718440 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.718414 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/50b1a4fa-9a2f-4da4-a1e6-29794d728c75-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-7b9db494bf-49pcj\" (UID: \"50b1a4fa-9a2f-4da4-a1e6-29794d728c75\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" Apr 22 18:47:01.726737 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.726711 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfjr5\" (UniqueName: \"kubernetes.io/projected/50b1a4fa-9a2f-4da4-a1e6-29794d728c75-kube-api-access-vfjr5\") pod \"managed-serviceaccount-addon-agent-7b9db494bf-49pcj\" (UID: \"50b1a4fa-9a2f-4da4-a1e6-29794d728c75\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" Apr 22 18:47:01.726868 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.726813 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwpdf\" (UniqueName: \"kubernetes.io/projected/efb38099-2266-40a5-ba8f-a7759b82543b-kube-api-access-pwpdf\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " 
pod="openshift-dns/dns-default-22c27" Apr 22 18:47:01.726943 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.726921 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbhs8\" (UniqueName: \"kubernetes.io/projected/85daa685-3b8e-4641-b717-08df86db79f9-kube-api-access-wbhs8\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:01.729559 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.729524 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:47:01.742309 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:01.742281 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" Apr 22 18:47:02.119381 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.119338 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:47:02.119574 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.119402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:02.119574 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.119507 2579 secret.go:189] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:47:02.119574 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.119540 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:02.119574 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.119551 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found Apr 22 18:47:02.119777 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.119589 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:03.119566793 +0000 UTC m=+35.119140685 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found Apr 22 18:47:02.119777 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.119610 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:47:03.119599489 +0000 UTC m=+35.119173369 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found Apr 22 18:47:02.219862 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.219830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:02.220052 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.219899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:02.220052 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.219995 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:02.220052 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.220045 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:02.220201 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.220069 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:03.220049561 +0000 UTC m=+35.219623450 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found Apr 22 18:47:02.220201 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.220114 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:03.220096862 +0000 UTC m=+35.219670740 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found Apr 22 18:47:02.320781 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.320744 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:47:02.321187 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.320909 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 18:47:02.321187 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:02.320994 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:47:34.320972286 +0000 UTC m=+66.320546178 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : secret "metrics-daemon-secret" not found Apr 22 18:47:02.421400 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.421293 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:47:02.424250 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.424199 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8gp4\" (UniqueName: \"kubernetes.io/projected/d340dfa0-a9e2-48b1-ad81-8921d5782b2e-kube-api-access-k8gp4\") pod \"network-check-target-jnbvq\" (UID: \"d340dfa0-a9e2-48b1-ad81-8921d5782b2e\") " pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:47:02.449897 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.449852 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:47:02.502731 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.502670 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj"] Apr 22 18:47:02.506848 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.506100 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"] Apr 22 18:47:02.507249 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.507201 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl"] Apr 22 18:47:02.592783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.592756 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jnbvq"] Apr 22 18:47:02.621476 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:47:02.621440 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod463cb4ff_3a7f_462a_8187_d77573dd3e54.slice/crio-75386a484d7bb07945e80c4846193fcbf9e2ebd8db0ae873271490e607a6c8c5 WatchSource:0}: Error finding container 75386a484d7bb07945e80c4846193fcbf9e2ebd8db0ae873271490e607a6c8c5: Status 404 returned error can't find the container with id 75386a484d7bb07945e80c4846193fcbf9e2ebd8db0ae873271490e607a6c8c5 Apr 22 18:47:02.622792 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:47:02.622767 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a76dcd7_19d7_40d1_8cd5_dc766c360423.slice/crio-41ea5495f68851c48473f37f4e4e1428a81ce96d75cab48f82dd6ef62796466d WatchSource:0}: Error finding container 41ea5495f68851c48473f37f4e4e1428a81ce96d75cab48f82dd6ef62796466d: Status 404 returned error can't find the 
container with id 41ea5495f68851c48473f37f4e4e1428a81ce96d75cab48f82dd6ef62796466d Apr 22 18:47:02.623370 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:47:02.623354 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd340dfa0_a9e2_48b1_ad81_8921d5782b2e.slice/crio-2c9af8f14f13a2179375d1b0ea333070d3c0539737309de97ec1b40b57ccf23f WatchSource:0}: Error finding container 2c9af8f14f13a2179375d1b0ea333070d3c0539737309de97ec1b40b57ccf23f: Status 404 returned error can't find the container with id 2c9af8f14f13a2179375d1b0ea333070d3c0539737309de97ec1b40b57ccf23f Apr 22 18:47:02.728157 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.728098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" event={"ID":"50b1a4fa-9a2f-4da4-a1e6-29794d728c75","Type":"ContainerStarted","Data":"baea1e6e1d1a1cc2c4962b6b1aa6ca35e81d711f362fb8588c79acddd9a38ea5"} Apr 22 18:47:02.729224 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.729194 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" event={"ID":"463cb4ff-3a7f-462a-8187-d77573dd3e54","Type":"ContainerStarted","Data":"75386a484d7bb07945e80c4846193fcbf9e2ebd8db0ae873271490e607a6c8c5"} Apr 22 18:47:02.730239 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.730201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jnbvq" event={"ID":"d340dfa0-a9e2-48b1-ad81-8921d5782b2e","Type":"ContainerStarted","Data":"2c9af8f14f13a2179375d1b0ea333070d3c0539737309de97ec1b40b57ccf23f"} Apr 22 18:47:02.731117 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:02.731096 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" 
event={"ID":"3a76dcd7-19d7-40d1-8cd5-dc766c360423","Type":"ContainerStarted","Data":"41ea5495f68851c48473f37f4e4e1428a81ce96d75cab48f82dd6ef62796466d"} Apr 22 18:47:03.127445 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:03.127403 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:47:03.127445 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:03.127452 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:03.127663 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.127568 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:47:03.127663 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.127658 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:05.12763987 +0000 UTC m=+37.127213748 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found Apr 22 18:47:03.127734 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.127570 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:03.127734 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.127682 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found Apr 22 18:47:03.127734 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.127721 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:47:05.127710303 +0000 UTC m=+37.127284179 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found Apr 22 18:47:03.228120 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:03.228087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:03.228316 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:03.228133 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:03.228316 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.228246 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:03.228316 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.228299 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:03.228471 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.228336 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:05.228315419 +0000 UTC m=+37.227889312 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found Apr 22 18:47:03.228471 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:03.228358 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:05.228348349 +0000 UTC m=+37.227922233 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found Apr 22 18:47:03.740107 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:03.739117 2579 generic.go:358] "Generic (PLEG): container finished" podID="68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b" containerID="cad5deee4eadf0592f6960631a4881ac9920f0ad319a5f514b2a5e7bfb20e69a" exitCode=0 Apr 22 18:47:03.740107 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:03.739373 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerDied","Data":"cad5deee4eadf0592f6960631a4881ac9920f0ad319a5f514b2a5e7bfb20e69a"} Apr 22 18:47:04.749030 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:04.748347 2579 generic.go:358] "Generic (PLEG): container finished" podID="68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b" containerID="f287105a7d7f667275a7dc28208eb34793485ea3245792fbc0f41b1acb487935" exitCode=0 Apr 22 18:47:04.749030 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:04.748403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" 
event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerDied","Data":"f287105a7d7f667275a7dc28208eb34793485ea3245792fbc0f41b1acb487935"} Apr 22 18:47:05.149446 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:05.149359 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:47:05.149446 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:05.149423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:05.149666 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.149554 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:05.149666 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.149571 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found Apr 22 18:47:05.149666 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.149628 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:47:09.149609216 +0000 UTC m=+41.149183095 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found Apr 22 18:47:05.149827 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.149695 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:47:05.149827 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.149728 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:09.149718015 +0000 UTC m=+41.149291892 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found Apr 22 18:47:05.251568 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:05.250687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:05.251568 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:05.250842 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" 
Apr 22 18:47:05.251568 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.250977 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:05.251568 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.251048 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:09.251029966 +0000 UTC m=+41.250603842 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found Apr 22 18:47:05.251568 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.251464 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:05.251568 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:05.251512 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:09.251497596 +0000 UTC m=+41.251071484 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found Apr 22 18:47:05.754472 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:05.754434 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" event={"ID":"68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b","Type":"ContainerStarted","Data":"a4c7cd424809033ea95a0da8150d2957151dee3bd1fb3e595d375c7052d01d46"} Apr 22 18:47:05.782872 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:05.782820 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-k6ptp" podStartSLOduration=6.28872471 podStartE2EDuration="37.782802443s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:46:31.162086132 +0000 UTC m=+3.161660009" lastFinishedPulling="2026-04-22 18:47:02.656163865 +0000 UTC m=+34.655737742" observedRunningTime="2026-04-22 18:47:05.780720277 +0000 UTC m=+37.780294173" watchObservedRunningTime="2026-04-22 18:47:05.782802443 +0000 UTC m=+37.782376342" Apr 22 18:47:08.880962 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:08.880918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: \"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:47:08.884959 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:08.884930 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/1e505d90-2a43-4f9a-a513-c9f1e8c46ac4-original-pull-secret\") pod \"global-pull-secret-syncer-hr4mw\" (UID: 
\"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4\") " pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:47:09.061645 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:09.061613 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hr4mw" Apr 22 18:47:09.183278 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:09.183234 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:47:09.183472 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:09.183304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:47:09.183472 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.183410 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 18:47:09.183577 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.183490 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.183469899 +0000 UTC m=+49.183043793 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found Apr 22 18:47:09.183577 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.183411 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 18:47:09.183577 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.183523 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found Apr 22 18:47:09.183577 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.183573 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.183558363 +0000 UTC m=+49.183132242 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found Apr 22 18:47:09.284715 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:09.284672 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:47:09.284881 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:09.284735 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:47:09.284881 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.284837 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 18:47:09.285010 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.284909 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.284888291 +0000 UTC m=+49.284462189 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found Apr 22 18:47:09.285010 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.284836 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 18:47:09.285010 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:09.285005 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:17.28498347 +0000 UTC m=+49.284557364 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found Apr 22 18:47:10.711438 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.711414 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hr4mw"] Apr 22 18:47:10.718792 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:47:10.717889 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e505d90_2a43_4f9a_a513_c9f1e8c46ac4.slice/crio-b1b8cdbc30e62d59e319d86fe4ea0522f770495a97220793d2d6be84c4cd01db WatchSource:0}: Error finding container b1b8cdbc30e62d59e319d86fe4ea0522f770495a97220793d2d6be84c4cd01db: Status 404 returned error can't find the container with id b1b8cdbc30e62d59e319d86fe4ea0522f770495a97220793d2d6be84c4cd01db Apr 22 18:47:10.768906 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.768874 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" event={"ID":"50b1a4fa-9a2f-4da4-a1e6-29794d728c75","Type":"ContainerStarted","Data":"cd476e2ef246e2c51b9588df6de83ac5a11f6add631c813d71eb38e466ce6295"} Apr 22 18:47:10.770339 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.770314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" event={"ID":"463cb4ff-3a7f-462a-8187-d77573dd3e54","Type":"ContainerStarted","Data":"1412a05f29e6a8d66cc9e6a8593694ad001c7c82694ccd7d190515d0ef79c2c9"} Apr 22 18:47:10.771325 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.771299 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jnbvq" event={"ID":"d340dfa0-a9e2-48b1-ad81-8921d5782b2e","Type":"ContainerStarted","Data":"92d5dd5b292fd4aa627a8c15a168b5372b7be3c6c87ac066c4e78595ed03d0e6"} Apr 22 18:47:10.771484 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.771471 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-jnbvq" Apr 22 18:47:10.772499 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.772478 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" event={"ID":"3a76dcd7-19d7-40d1-8cd5-dc766c360423","Type":"ContainerStarted","Data":"bc3f47ba3441a33b63fcad632dafbe1a61b93a285d49897ef479ba18fb7eae8f"} Apr 22 18:47:10.773649 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.773626 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hr4mw" event={"ID":"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4","Type":"ContainerStarted","Data":"b1b8cdbc30e62d59e319d86fe4ea0522f770495a97220793d2d6be84c4cd01db"} Apr 22 18:47:10.793039 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.792986 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" podStartSLOduration=24.827090829 podStartE2EDuration="32.792970258s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:47:02.620352797 +0000 UTC m=+34.619926678" lastFinishedPulling="2026-04-22 18:47:10.586232231 +0000 UTC m=+42.585806107" observedRunningTime="2026-04-22 18:47:10.791773629 +0000 UTC m=+42.791347527" watchObservedRunningTime="2026-04-22 18:47:10.792970258 +0000 UTC m=+42.792544152" Apr 22 18:47:10.814243 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:10.814142 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-jnbvq" podStartSLOduration=34.86024596 podStartE2EDuration="42.814127132s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:47:02.633327112 +0000 UTC m=+34.632900996" lastFinishedPulling="2026-04-22 18:47:10.587208279 +0000 UTC m=+42.586782168" observedRunningTime="2026-04-22 18:47:10.813538357 +0000 UTC m=+42.813112280" watchObservedRunningTime="2026-04-22 18:47:10.814127132 +0000 UTC m=+42.813701022" Apr 22 18:47:11.793809 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:11.793729 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" podStartSLOduration=25.824451593 podStartE2EDuration="33.793709452s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:47:02.633211214 +0000 UTC m=+34.632785103" lastFinishedPulling="2026-04-22 18:47:10.602469083 +0000 UTC m=+42.602042962" observedRunningTime="2026-04-22 18:47:11.793473114 +0000 UTC m=+43.793047012" watchObservedRunningTime="2026-04-22 18:47:11.793709452 +0000 UTC m=+43.793283350" Apr 22 18:47:15.785701 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:47:15.785664 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hr4mw" event={"ID":"1e505d90-2a43-4f9a-a513-c9f1e8c46ac4","Type":"ContainerStarted","Data":"c716519c508410d90ca0b36da402918f0ea948808f4fde3b9cdc8dadd6c520c4"}
Apr 22 18:47:15.787345 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:15.787314 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" event={"ID":"463cb4ff-3a7f-462a-8187-d77573dd3e54","Type":"ContainerStarted","Data":"96d729fcf569d586ecd5e1042f8f43db6b2388e609f773b5fb95f0505efc0f36"}
Apr 22 18:47:15.787345 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:15.787339 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" event={"ID":"463cb4ff-3a7f-462a-8187-d77573dd3e54","Type":"ContainerStarted","Data":"c21704111a39ee9f2d4c0def3fa72a199df7840abc1d09ac3d27f1f5500a4029"}
Apr 22 18:47:15.802858 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:15.802818 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hr4mw" podStartSLOduration=35.450781411 podStartE2EDuration="39.80280639s" podCreationTimestamp="2026-04-22 18:46:36 +0000 UTC" firstStartedPulling="2026-04-22 18:47:10.723128385 +0000 UTC m=+42.722702267" lastFinishedPulling="2026-04-22 18:47:15.075153371 +0000 UTC m=+47.074727246" observedRunningTime="2026-04-22 18:47:15.802002963 +0000 UTC m=+47.801576860" watchObservedRunningTime="2026-04-22 18:47:15.80280639 +0000 UTC m=+47.802380284"
Apr 22 18:47:15.820802 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:15.820761 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" podStartSLOduration=25.658780487 podStartE2EDuration="37.820748066s" podCreationTimestamp="2026-04-22 18:46:38 +0000 UTC" firstStartedPulling="2026-04-22 18:47:02.633054908 +0000 UTC m=+34.632628794" lastFinishedPulling="2026-04-22 18:47:14.795022493 +0000 UTC m=+46.794596373" observedRunningTime="2026-04-22 18:47:15.81995447 +0000 UTC m=+47.819528370" watchObservedRunningTime="2026-04-22 18:47:15.820748066 +0000 UTC m=+47.820321962"
Apr 22 18:47:17.252083 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:17.252045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:47:17.252083 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:17.252087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:17.252518 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.252186 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:47:17.252518 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.252187 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:47:17.252518 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.252259 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:33.252244018 +0000 UTC m=+65.251817893 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found
Apr 22 18:47:17.252518 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.252195 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found
Apr 22 18:47:17.252518 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.252315 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:47:33.252303358 +0000 UTC m=+65.251877233 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found
Apr 22 18:47:17.352961 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:17.352932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv"
Apr 22 18:47:17.353098 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:17.352972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27"
Apr 22 18:47:17.353098 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.353075 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:17.353098 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.353091 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:17.353195 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.353132 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:47:33.353116463 +0000 UTC m=+65.352690343 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found
Apr 22 18:47:17.353195 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:17.353146 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:47:33.353140038 +0000 UTC m=+65.352713913 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:21.776314 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:21.776254 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:21.776923 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:21.776901 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:47:26.726404 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:26.726377 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j89vb"
Apr 22 18:47:33.278482 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:33.278440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:47:33.278482 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:33.278488 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:47:33.278892 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.278585 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:47:33.278892 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.278661 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:05.278643897 +0000 UTC m=+97.278217772 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found
Apr 22 18:47:33.278892 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.278590 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:47:33.278892 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.278684 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found
Apr 22 18:47:33.278892 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.278734 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:48:05.278721826 +0000 UTC m=+97.278295700 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found
Apr 22 18:47:33.379575 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:33.379533 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv"
Apr 22 18:47:33.379575 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:33.379580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27"
Apr 22 18:47:33.379775 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.379673 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:47:33.379775 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.379702 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:47:33.379775 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.379762 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:48:05.379717592 +0000 UTC m=+97.379291470 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found
Apr 22 18:47:33.379775 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:33.379777 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:48:05.37977028 +0000 UTC m=+97.379344154 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found
Apr 22 18:47:34.387641 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:34.387604 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:47:34.388031 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:34.387716 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:47:34.388031 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:47:34.387770 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:48:38.387757288 +0000 UTC m=+130.387331163 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : secret "metrics-daemon-secret" not found
Apr 22 18:47:41.777918 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:47:41.777887 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jnbvq"
Apr 22 18:48:05.323859 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:48:05.323813 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:48:05.324333 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:48:05.323863 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:48:05.324333 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.323967 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:48:05.324333 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.324038 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:09.324023754 +0000 UTC m=+161.323597635 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found
Apr 22 18:48:05.324333 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.323970 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:48:05.324333 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.324079 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found
Apr 22 18:48:05.324333 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.324127 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:49:09.324115991 +0000 UTC m=+161.323689867 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found
Apr 22 18:48:05.424988 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:48:05.424906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv"
Apr 22 18:48:05.424988 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:48:05.424955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27"
Apr 22 18:48:05.425162 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.425059 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:48:05.425162 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.425095 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:48:05.425162 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.425125 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:09.425107935 +0000 UTC m=+161.424681825 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found
Apr 22 18:48:05.425162 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:05.425141 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:49:09.425135506 +0000 UTC m=+161.424709384 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found
Apr 22 18:48:38.468533 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:48:38.468498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:48:38.469059 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:38.468630 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 22 18:48:38.469059 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:48:38.468692 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs podName:4b04d910-b761-4095-a135-7026105ff82f nodeName:}" failed. No retries permitted until 2026-04-22 18:50:40.468676163 +0000 UTC m=+252.468250037 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs") pod "network-metrics-daemon-n2rv2" (UID: "4b04d910-b761-4095-a135-7026105ff82f") : secret "metrics-daemon-secret" not found
Apr 22 18:49:03.523872 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:03.523840 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8vvlg_fdf18fe1-67fc-415f-b637-f1d3a5343441/dns-node-resolver/0.log"
Apr 22 18:49:04.370134 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:04.370096 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" podUID="de03fc44-c0d2-4f77-9197-77718a6f0aef"
Apr 22 18:49:04.382356 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:04.382326 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" podUID="477c8ebb-278f-4a30-9476-d0758c0fce10"
Apr 22 18:49:04.473274 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:04.473223 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9wfwv" podUID="85daa685-3b8e-4641-b717-08df86db79f9"
Apr 22 18:49:04.478417 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:04.478394 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-22c27" podUID="efb38099-2266-40a5-ba8f-a7759b82543b"
Apr 22 18:49:04.556857 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:04.556824 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-n2rv2" podUID="4b04d910-b761-4095-a135-7026105ff82f"
Apr 22 18:49:04.924340 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:04.924312 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sdfvz_b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797/node-ca/0.log"
Apr 22 18:49:05.031547 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:05.031520 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:49:05.031691 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:05.031563 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9wfwv"
Apr 22 18:49:05.031691 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:05.031578 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:49:09.418560 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:09.418516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"
Apr 22 18:49:09.418560 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:09.418563 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") pod \"image-registry-6d7787fd58-v5n8p\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p"
Apr 22 18:49:09.419125 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.418670 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 18:49:09.419125 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.418680 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 18:49:09.419125 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.418692 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6d7787fd58-v5n8p: secret "image-registry-tls" not found
Apr 22 18:49:09.419125 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.418743 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls podName:de03fc44-c0d2-4f77-9197-77718a6f0aef nodeName:}" failed. No retries permitted until 2026-04-22 18:51:11.418724767 +0000 UTC m=+283.418298655 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls") pod "image-registry-6d7787fd58-v5n8p" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef") : secret "image-registry-tls" not found
Apr 22 18:49:09.419125 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.418758 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert podName:477c8ebb-278f-4a30-9476-d0758c0fce10 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:11.418750218 +0000 UTC m=+283.418324093 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-ckmv9" (UID: "477c8ebb-278f-4a30-9476-d0758c0fce10") : secret "networking-console-plugin-cert" not found
Apr 22 18:49:09.519574 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:09.519540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv"
Apr 22 18:49:09.519755 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:09.519591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27"
Apr 22 18:49:09.519755 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.519638 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 18:49:09.519755 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.519693 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert podName:85daa685-3b8e-4641-b717-08df86db79f9 nodeName:}" failed. No retries permitted until 2026-04-22 18:51:11.519679076 +0000 UTC m=+283.519252951 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert") pod "ingress-canary-9wfwv" (UID: "85daa685-3b8e-4641-b717-08df86db79f9") : secret "canary-serving-cert" not found
Apr 22 18:49:09.519755 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.519746 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 18:49:09.519936 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:09.519808 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls podName:efb38099-2266-40a5-ba8f-a7759b82543b nodeName:}" failed. No retries permitted until 2026-04-22 18:51:11.519791217 +0000 UTC m=+283.519365093 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls") pod "dns-default-22c27" (UID: "efb38099-2266-40a5-ba8f-a7759b82543b") : secret "dns-default-metrics-tls" not found
Apr 22 18:49:11.045447 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.045411 2579 generic.go:358] "Generic (PLEG): container finished" podID="3a76dcd7-19d7-40d1-8cd5-dc766c360423" containerID="bc3f47ba3441a33b63fcad632dafbe1a61b93a285d49897ef479ba18fb7eae8f" exitCode=1
Apr 22 18:49:11.045862 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.045473 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" event={"ID":"3a76dcd7-19d7-40d1-8cd5-dc766c360423","Type":"ContainerDied","Data":"bc3f47ba3441a33b63fcad632dafbe1a61b93a285d49897ef479ba18fb7eae8f"}
Apr 22 18:49:11.045862 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.045837 2579 scope.go:117] "RemoveContainer" containerID="bc3f47ba3441a33b63fcad632dafbe1a61b93a285d49897ef479ba18fb7eae8f"
Apr 22 18:49:11.048380 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.048118 2579 generic.go:358] "Generic (PLEG): container finished" podID="50b1a4fa-9a2f-4da4-a1e6-29794d728c75" containerID="cd476e2ef246e2c51b9588df6de83ac5a11f6add631c813d71eb38e466ce6295" exitCode=255
Apr 22 18:49:11.048380 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.048165 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" event={"ID":"50b1a4fa-9a2f-4da4-a1e6-29794d728c75","Type":"ContainerDied","Data":"cd476e2ef246e2c51b9588df6de83ac5a11f6add631c813d71eb38e466ce6295"}
Apr 22 18:49:11.048519 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.048482 2579 scope.go:117] "RemoveContainer" containerID="cd476e2ef246e2c51b9588df6de83ac5a11f6add631c813d71eb38e466ce6295"
Apr 22 18:49:11.706671 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.706633 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:49:11.742483 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.742458 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj"
Apr 22 18:49:11.776851 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:11.776827 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:49:12.052191 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:12.052089 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2" event={"ID":"3a76dcd7-19d7-40d1-8cd5-dc766c360423","Type":"ContainerStarted","Data":"a292820a568cbb1b93a800a751dc494f0d4f03fa354f1495c43a35d78b711ced"}
Apr 22 18:49:12.052663 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:12.052316 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:49:12.053051 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:12.053027 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-697466dcfb-6mng2"
Apr 22 18:49:12.053804 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:12.053785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-7b9db494bf-49pcj" event={"ID":"50b1a4fa-9a2f-4da4-a1e6-29794d728c75","Type":"ContainerStarted","Data":"0766b9ab96f703bf129bb56d4f157c6623ea3958db5aecdb3fc5ecd3a8256171"}
Apr 22 18:49:17.539034 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:17.538970 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-22c27"
Apr 22 18:49:18.540077 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:18.540038 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2"
Apr 22 18:49:23.839132 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.839102 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-j7lrq"]
Apr 22 18:49:23.842353 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.842329 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:23.845198 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.845177 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 18:49:23.846872 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.846847 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 18:49:23.846981 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.846874 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 18:49:23.846981 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.846879 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-n2rqf\""
Apr 22 18:49:23.846981 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.846848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 18:49:23.853580 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.853561 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j7lrq"]
Apr 22 18:49:23.927875 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.927838 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fb42db1a-d501-4d76-be24-a264eb8f5075-data-volume\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:23.927875 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.927879 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fb42db1a-d501-4d76-be24-a264eb8f5075-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:23.928058 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.927945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hcvj\" (UniqueName: \"kubernetes.io/projected/fb42db1a-d501-4d76-be24-a264eb8f5075-kube-api-access-4hcvj\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:23.928058 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.927991 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fb42db1a-d501-4d76-be24-a264eb8f5075-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:23.928128 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:23.928083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fb42db1a-d501-4d76-be24-a264eb8f5075-crio-socket\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:24.028553 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.028518 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/fb42db1a-d501-4d76-be24-a264eb8f5075-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:24.028703 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.028602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fb42db1a-d501-4d76-be24-a264eb8f5075-crio-socket\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:24.028703 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.028623 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fb42db1a-d501-4d76-be24-a264eb8f5075-data-volume\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq"
Apr 22 18:49:24.028703 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.028645 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fb42db1a-d501-4d76-be24-a264eb8f5075-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " 
pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.028703 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.028661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/fb42db1a-d501-4d76-be24-a264eb8f5075-crio-socket\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.028703 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.028687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hcvj\" (UniqueName: \"kubernetes.io/projected/fb42db1a-d501-4d76-be24-a264eb8f5075-kube-api-access-4hcvj\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.028967 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.028945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/fb42db1a-d501-4d76-be24-a264eb8f5075-data-volume\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.029197 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.029181 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/fb42db1a-d501-4d76-be24-a264eb8f5075-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.030798 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.030773 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/fb42db1a-d501-4d76-be24-a264eb8f5075-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.040308 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.040282 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hcvj\" (UniqueName: \"kubernetes.io/projected/fb42db1a-d501-4d76-be24-a264eb8f5075-kube-api-access-4hcvj\") pod \"insights-runtime-extractor-j7lrq\" (UID: \"fb42db1a-d501-4d76-be24-a264eb8f5075\") " pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.151102 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.151022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-j7lrq" Apr 22 18:49:24.263730 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:24.263704 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-j7lrq"] Apr 22 18:49:24.265966 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:49:24.265938 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb42db1a_d501_4d76_be24_a264eb8f5075.slice/crio-48005ac00765911bf4edc3e4dbc2ba708781964fc02cebc31ddf69acab637fac WatchSource:0}: Error finding container 48005ac00765911bf4edc3e4dbc2ba708781964fc02cebc31ddf69acab637fac: Status 404 returned error can't find the container with id 48005ac00765911bf4edc3e4dbc2ba708781964fc02cebc31ddf69acab637fac Apr 22 18:49:25.085875 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:25.085835 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7lrq" event={"ID":"fb42db1a-d501-4d76-be24-a264eb8f5075","Type":"ContainerStarted","Data":"c63cc40e1c4efa8ca2d5f5a4ef0794fc9829ffc37502aede446d31d9ca48a57e"} Apr 22 
18:49:25.085875 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:25.085870 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7lrq" event={"ID":"fb42db1a-d501-4d76-be24-a264eb8f5075","Type":"ContainerStarted","Data":"6ac2b26d46b3fd86e6d4059b388ee17609cd1aac004ccbc0cadfdce06cf23919"} Apr 22 18:49:25.085875 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:25.085879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7lrq" event={"ID":"fb42db1a-d501-4d76-be24-a264eb8f5075","Type":"ContainerStarted","Data":"48005ac00765911bf4edc3e4dbc2ba708781964fc02cebc31ddf69acab637fac"} Apr 22 18:49:27.094029 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:27.093996 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-j7lrq" event={"ID":"fb42db1a-d501-4d76-be24-a264eb8f5075","Type":"ContainerStarted","Data":"252462a37641be34087df81af148aaa4ccd7ad321a7aec323d993704501329b4"} Apr 22 18:49:27.112525 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:27.112475 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-j7lrq" podStartSLOduration=2.18163936 podStartE2EDuration="4.112460334s" podCreationTimestamp="2026-04-22 18:49:23 +0000 UTC" firstStartedPulling="2026-04-22 18:49:24.317694078 +0000 UTC m=+176.317267956" lastFinishedPulling="2026-04-22 18:49:26.248515055 +0000 UTC m=+178.248088930" observedRunningTime="2026-04-22 18:49:27.111694405 +0000 UTC m=+179.111268302" watchObservedRunningTime="2026-04-22 18:49:27.112460334 +0000 UTC m=+179.112034230" Apr 22 18:49:38.748355 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.748326 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7m87w"] Apr 22 18:49:38.751464 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.751448 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.754318 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.754298 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 18:49:38.754420 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.754346 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 18:49:38.755564 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.755543 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 18:49:38.755564 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.755562 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 18:49:38.755724 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.755567 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 18:49:38.755724 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.755563 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-mrmnz\"" Apr 22 18:49:38.755724 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.755601 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 18:49:38.849078 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849049 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-wtmp\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " 
pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849228 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-root\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849228 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849136 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849228 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849160 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k42gl\" (UniqueName: \"kubernetes.io/projected/52eb41d6-6bcb-4547-bcb0-bb79ad417873-kube-api-access-k42gl\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849228 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52eb41d6-6bcb-4547-bcb0-bb79ad417873-metrics-client-ca\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849408 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-sys\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849408 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849313 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-accelerators-collector-config\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849408 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-tls\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.849408 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.849361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-textfile\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.949650 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-wtmp\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.949821 ip-10-0-137-19 kubenswrapper[2579]: 
I0422 18:49:38.949667 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-root\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.949821 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.949821 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949736 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k42gl\" (UniqueName: \"kubernetes.io/projected/52eb41d6-6bcb-4547-bcb0-bb79ad417873-kube-api-access-k42gl\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.949821 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-root\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.949821 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949759 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-wtmp\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.949821 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:49:38.949767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52eb41d6-6bcb-4547-bcb0-bb79ad417873-metrics-client-ca\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950169 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949825 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-sys\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950169 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949864 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-accelerators-collector-config\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950169 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949902 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-tls\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950169 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/52eb41d6-6bcb-4547-bcb0-bb79ad417873-sys\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950169 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.949932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-textfile\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950169 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:38.950009 2579 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 22 18:49:38.950169 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:38.950063 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-tls podName:52eb41d6-6bcb-4547-bcb0-bb79ad417873 nodeName:}" failed. No retries permitted until 2026-04-22 18:49:39.45004426 +0000 UTC m=+191.449618134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-tls") pod "node-exporter-7m87w" (UID: "52eb41d6-6bcb-4547-bcb0-bb79ad417873") : secret "node-exporter-tls" not found Apr 22 18:49:38.950536 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.950321 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-textfile\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950536 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.950463 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/52eb41d6-6bcb-4547-bcb0-bb79ad417873-metrics-client-ca\") pod \"node-exporter-7m87w\" (UID: 
\"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.950599 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.950557 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-accelerators-collector-config\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.952090 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.952074 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:38.959842 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:38.959822 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k42gl\" (UniqueName: \"kubernetes.io/projected/52eb41d6-6bcb-4547-bcb0-bb79ad417873-kube-api-access-k42gl\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:39.453813 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:39.453761 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-tls\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:39.456041 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:39.456020 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/52eb41d6-6bcb-4547-bcb0-bb79ad417873-node-exporter-tls\") pod \"node-exporter-7m87w\" (UID: \"52eb41d6-6bcb-4547-bcb0-bb79ad417873\") " pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:39.660304 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:39.660253 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7m87w" Apr 22 18:49:39.667904 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:49:39.667877 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52eb41d6_6bcb_4547_bcb0_bb79ad417873.slice/crio-229d21d26b341c3604d6921fd4236f22b9ff989114f424371c1495e8ef4d843d WatchSource:0}: Error finding container 229d21d26b341c3604d6921fd4236f22b9ff989114f424371c1495e8ef4d843d: Status 404 returned error can't find the container with id 229d21d26b341c3604d6921fd4236f22b9ff989114f424371c1495e8ef4d843d Apr 22 18:49:40.126297 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:40.126240 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7m87w" event={"ID":"52eb41d6-6bcb-4547-bcb0-bb79ad417873","Type":"ContainerStarted","Data":"229d21d26b341c3604d6921fd4236f22b9ff989114f424371c1495e8ef4d843d"} Apr 22 18:49:41.129635 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:41.129600 2579 generic.go:358] "Generic (PLEG): container finished" podID="52eb41d6-6bcb-4547-bcb0-bb79ad417873" containerID="22c4d87a2b547bce8056dce1350e407b89c61b274525ec49e10e89a923459fab" exitCode=0 Apr 22 18:49:41.129635 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:41.129639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7m87w" event={"ID":"52eb41d6-6bcb-4547-bcb0-bb79ad417873","Type":"ContainerDied","Data":"22c4d87a2b547bce8056dce1350e407b89c61b274525ec49e10e89a923459fab"} Apr 22 18:49:42.133473 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:42.133437 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7m87w" event={"ID":"52eb41d6-6bcb-4547-bcb0-bb79ad417873","Type":"ContainerStarted","Data":"b11a36eb26d38e6f4d4703deb1868236f6895a60ee9691f43cc5c5226f3f1e68"} Apr 22 18:49:42.133872 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:42.133481 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7m87w" event={"ID":"52eb41d6-6bcb-4547-bcb0-bb79ad417873","Type":"ContainerStarted","Data":"929f0b588c1d2d482a9d5e4a31db87cec38dc99d767e201c297b5501b0c5f8ec"} Apr 22 18:49:45.833993 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:45.833942 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7m87w" podStartSLOduration=7.072146073 podStartE2EDuration="7.833925928s" podCreationTimestamp="2026-04-22 18:49:38 +0000 UTC" firstStartedPulling="2026-04-22 18:49:39.669754584 +0000 UTC m=+191.669328474" lastFinishedPulling="2026-04-22 18:49:40.431534452 +0000 UTC m=+192.431108329" observedRunningTime="2026-04-22 18:49:42.155296233 +0000 UTC m=+194.154870129" watchObservedRunningTime="2026-04-22 18:49:45.833925928 +0000 UTC m=+197.833499825" Apr 22 18:49:45.834378 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:45.834351 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d7787fd58-v5n8p"] Apr 22 18:49:45.834530 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:49:45.834512 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" podUID="de03fc44-c0d2-4f77-9197-77718a6f0aef" Apr 22 18:49:46.143351 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.143253 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:49:46.147363 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.147323 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:49:46.213300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213246 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-trusted-ca\") pod \"de03fc44-c0d2-4f77-9197-77718a6f0aef\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " Apr 22 18:49:46.213300 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213308 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th7g9\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-kube-api-access-th7g9\") pod \"de03fc44-c0d2-4f77-9197-77718a6f0aef\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " Apr 22 18:49:46.213517 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213336 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-certificates\") pod \"de03fc44-c0d2-4f77-9197-77718a6f0aef\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " Apr 22 18:49:46.213517 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213458 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-installation-pull-secrets\") pod \"de03fc44-c0d2-4f77-9197-77718a6f0aef\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " Apr 22 18:49:46.213621 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213523 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-image-registry-private-configuration\") pod \"de03fc44-c0d2-4f77-9197-77718a6f0aef\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " Apr 22 18:49:46.213621 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213565 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-bound-sa-token\") pod \"de03fc44-c0d2-4f77-9197-77718a6f0aef\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " Apr 22 18:49:46.213621 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213567 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "de03fc44-c0d2-4f77-9197-77718a6f0aef" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:46.213621 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213596 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de03fc44-c0d2-4f77-9197-77718a6f0aef-ca-trust-extracted\") pod \"de03fc44-c0d2-4f77-9197-77718a6f0aef\" (UID: \"de03fc44-c0d2-4f77-9197-77718a6f0aef\") " Apr 22 18:49:46.213832 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213755 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "de03fc44-c0d2-4f77-9197-77718a6f0aef" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:49:46.213921 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213897 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de03fc44-c0d2-4f77-9197-77718a6f0aef-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "de03fc44-c0d2-4f77-9197-77718a6f0aef" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:49:46.213988 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213906 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-trusted-ca\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:46.213988 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.213957 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-certificates\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:46.215758 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.215730 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "de03fc44-c0d2-4f77-9197-77718a6f0aef" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:46.215851 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.215759 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-kube-api-access-th7g9" (OuterVolumeSpecName: "kube-api-access-th7g9") pod "de03fc44-c0d2-4f77-9197-77718a6f0aef" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef"). InnerVolumeSpecName "kube-api-access-th7g9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:46.215851 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.215798 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "de03fc44-c0d2-4f77-9197-77718a6f0aef" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:49:46.215851 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.215838 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "de03fc44-c0d2-4f77-9197-77718a6f0aef" (UID: "de03fc44-c0d2-4f77-9197-77718a6f0aef"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:49:46.315320 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.315285 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-installation-pull-secrets\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:46.315320 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.315313 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/de03fc44-c0d2-4f77-9197-77718a6f0aef-image-registry-private-configuration\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:46.315320 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.315324 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-bound-sa-token\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:46.315320 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.315333 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/de03fc44-c0d2-4f77-9197-77718a6f0aef-ca-trust-extracted\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:46.315581 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:46.315343 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-th7g9\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-kube-api-access-th7g9\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:47.145964 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:47.145930 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d7787fd58-v5n8p" Apr 22 18:49:47.185238 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:47.185175 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6d7787fd58-v5n8p"] Apr 22 18:49:47.189736 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:47.189708 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6d7787fd58-v5n8p"] Apr 22 18:49:47.323458 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:47.323427 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/de03fc44-c0d2-4f77-9197-77718a6f0aef-registry-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:49:48.545095 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:48.545063 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de03fc44-c0d2-4f77-9197-77718a6f0aef" path="/var/lib/kubelet/pods/de03fc44-c0d2-4f77-9197-77718a6f0aef/volumes" Apr 22 18:49:51.730970 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:49:51.730890 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" podUID="463cb4ff-3a7f-462a-8187-d77573dd3e54" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:50:01.731283 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:01.731221 2579 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" podUID="463cb4ff-3a7f-462a-8187-d77573dd3e54" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:50:11.731420 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:11.731381 2579 prober.go:120] "Probe failed" probeType="Liveness" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" podUID="463cb4ff-3a7f-462a-8187-d77573dd3e54" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 18:50:11.731842 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:11.731443 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" Apr 22 18:50:11.731898 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:11.731867 2579 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"96d729fcf569d586ecd5e1042f8f43db6b2388e609f773b5fb95f0505efc0f36"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 18:50:11.731945 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:11.731930 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" podUID="463cb4ff-3a7f-462a-8187-d77573dd3e54" containerName="service-proxy" containerID="cri-o://96d729fcf569d586ecd5e1042f8f43db6b2388e609f773b5fb95f0505efc0f36" gracePeriod=30 Apr 22 18:50:12.207199 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:12.207168 2579 generic.go:358] "Generic (PLEG): container finished" podID="463cb4ff-3a7f-462a-8187-d77573dd3e54" containerID="96d729fcf569d586ecd5e1042f8f43db6b2388e609f773b5fb95f0505efc0f36" exitCode=2 Apr 22 18:50:12.207369 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:12.207239 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" event={"ID":"463cb4ff-3a7f-462a-8187-d77573dd3e54","Type":"ContainerDied","Data":"96d729fcf569d586ecd5e1042f8f43db6b2388e609f773b5fb95f0505efc0f36"} Apr 22 
18:50:12.207369 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:12.207294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7ddfbcc4f5-7vpcl" event={"ID":"463cb4ff-3a7f-462a-8187-d77573dd3e54","Type":"ContainerStarted","Data":"998251936e862227006519458e4e591cefc9b6812be48b0d48d823fe3d445f95"} Apr 22 18:50:40.547850 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:40.547815 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:50:40.550121 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:40.550098 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4b04d910-b761-4095-a135-7026105ff82f-metrics-certs\") pod \"network-metrics-daemon-n2rv2\" (UID: \"4b04d910-b761-4095-a135-7026105ff82f\") " pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:50:40.743584 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:40.743554 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-9qpsz\"" Apr 22 18:50:40.751290 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:40.751246 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-n2rv2" Apr 22 18:50:40.868026 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:40.868003 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-n2rv2"] Apr 22 18:50:40.870042 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:50:40.870013 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b04d910_b761_4095_a135_7026105ff82f.slice/crio-f143cd41158b6aa80d125e9a2c1763f9b2580e8047a2eb79a56874ed771506fc WatchSource:0}: Error finding container f143cd41158b6aa80d125e9a2c1763f9b2580e8047a2eb79a56874ed771506fc: Status 404 returned error can't find the container with id f143cd41158b6aa80d125e9a2c1763f9b2580e8047a2eb79a56874ed771506fc Apr 22 18:50:41.278581 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:41.278536 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2rv2" event={"ID":"4b04d910-b761-4095-a135-7026105ff82f","Type":"ContainerStarted","Data":"f143cd41158b6aa80d125e9a2c1763f9b2580e8047a2eb79a56874ed771506fc"} Apr 22 18:50:42.286322 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:42.286288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2rv2" event={"ID":"4b04d910-b761-4095-a135-7026105ff82f","Type":"ContainerStarted","Data":"dde7b4a7f2717e2953021725bfd51c4a6a0d0df682d28aab38c00be47235e5d8"} Apr 22 18:50:42.286322 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:42.286322 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-n2rv2" event={"ID":"4b04d910-b761-4095-a135-7026105ff82f","Type":"ContainerStarted","Data":"de14b17029f5fddd3c195ab2081b5d4cf0b2492e853f1f003a7a4241a71933a4"} Apr 22 18:50:42.304241 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:50:42.304200 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-n2rv2" podStartSLOduration=253.417693713 podStartE2EDuration="4m14.304189275s" podCreationTimestamp="2026-04-22 18:46:28 +0000 UTC" firstStartedPulling="2026-04-22 18:50:40.871813366 +0000 UTC m=+252.871387258" lastFinishedPulling="2026-04-22 18:50:41.758308925 +0000 UTC m=+253.757882820" observedRunningTime="2026-04-22 18:50:42.302661977 +0000 UTC m=+254.302235875" watchObservedRunningTime="2026-04-22 18:50:42.304189275 +0000 UTC m=+254.303763171" Apr 22 18:51:08.032378 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:51:08.032317 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" podUID="477c8ebb-278f-4a30-9476-d0758c0fce10" Apr 22 18:51:08.032378 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:51:08.032320 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-9wfwv" podUID="85daa685-3b8e-4641-b717-08df86db79f9" Apr 22 18:51:08.359525 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:08.359444 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:51:08.359525 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:08.359503 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:51:11.484629 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.484572 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:51:11.487157 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.487127 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/477c8ebb-278f-4a30-9476-d0758c0fce10-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-ckmv9\" (UID: \"477c8ebb-278f-4a30-9476-d0758c0fce10\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:51:11.585726 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.585682 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:51:11.585909 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.585742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:51:11.587941 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.587919 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/efb38099-2266-40a5-ba8f-a7759b82543b-metrics-tls\") pod \"dns-default-22c27\" (UID: \"efb38099-2266-40a5-ba8f-a7759b82543b\") " pod="openshift-dns/dns-default-22c27" Apr 22 18:51:11.588130 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.588112 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/85daa685-3b8e-4641-b717-08df86db79f9-cert\") pod \"ingress-canary-9wfwv\" (UID: \"85daa685-3b8e-4641-b717-08df86db79f9\") " pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:51:11.663668 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.663634 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-m5qgq\"" Apr 22 18:51:11.664956 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.664938 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-q94s6\"" Apr 22 18:51:11.671159 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.671139 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9wfwv" Apr 22 18:51:11.671232 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.671140 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" Apr 22 18:51:11.800929 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.800907 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9wfwv"] Apr 22 18:51:11.803503 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:51:11.803469 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85daa685_3b8e_4641_b717_08df86db79f9.slice/crio-090ae036f778e1cd3f22b44458be130af82f8a74a22bd200bd1cc7ca7aacbf46 WatchSource:0}: Error finding container 090ae036f778e1cd3f22b44458be130af82f8a74a22bd200bd1cc7ca7aacbf46: Status 404 returned error can't find the container with id 090ae036f778e1cd3f22b44458be130af82f8a74a22bd200bd1cc7ca7aacbf46 Apr 22 18:51:11.816005 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.815984 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9"] Apr 22 18:51:11.818055 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:51:11.818026 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod477c8ebb_278f_4a30_9476_d0758c0fce10.slice/crio-ef773ad1a2fede24cab85caa586e4cefd7926989b8b4d5f1ea16197e116a2c5d WatchSource:0}: Error finding container ef773ad1a2fede24cab85caa586e4cefd7926989b8b4d5f1ea16197e116a2c5d: Status 404 returned error can't find the container with id ef773ad1a2fede24cab85caa586e4cefd7926989b8b4d5f1ea16197e116a2c5d Apr 22 18:51:11.842138 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.842114 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-w2ls6\"" Apr 22 18:51:11.850436 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.850417 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-22c27" Apr 22 18:51:11.963187 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:11.963163 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-22c27"] Apr 22 18:51:11.964773 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:51:11.964750 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefb38099_2266_40a5_ba8f_a7759b82543b.slice/crio-13d6542905d96f0e83ee11875c9e76b0fc50356d1303b64c937da44e44a12a87 WatchSource:0}: Error finding container 13d6542905d96f0e83ee11875c9e76b0fc50356d1303b64c937da44e44a12a87: Status 404 returned error can't find the container with id 13d6542905d96f0e83ee11875c9e76b0fc50356d1303b64c937da44e44a12a87 Apr 22 18:51:12.370407 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:12.370373 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" event={"ID":"477c8ebb-278f-4a30-9476-d0758c0fce10","Type":"ContainerStarted","Data":"ef773ad1a2fede24cab85caa586e4cefd7926989b8b4d5f1ea16197e116a2c5d"} Apr 22 18:51:12.371289 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:12.371250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-22c27" event={"ID":"efb38099-2266-40a5-ba8f-a7759b82543b","Type":"ContainerStarted","Data":"13d6542905d96f0e83ee11875c9e76b0fc50356d1303b64c937da44e44a12a87"} Apr 22 18:51:12.372090 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:12.372071 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9wfwv" event={"ID":"85daa685-3b8e-4641-b717-08df86db79f9","Type":"ContainerStarted","Data":"090ae036f778e1cd3f22b44458be130af82f8a74a22bd200bd1cc7ca7aacbf46"} Apr 22 18:51:13.375976 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:13.375889 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" event={"ID":"477c8ebb-278f-4a30-9476-d0758c0fce10","Type":"ContainerStarted","Data":"16c15b002034b22b690b2042924e4e44a6aa6701a17516a3f3792c86ef39fd4d"} Apr 22 18:51:13.393855 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:13.393793 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-ckmv9" podStartSLOduration=271.248793651 podStartE2EDuration="4m32.393775669s" podCreationTimestamp="2026-04-22 18:46:41 +0000 UTC" firstStartedPulling="2026-04-22 18:51:11.820085294 +0000 UTC m=+283.819659169" lastFinishedPulling="2026-04-22 18:51:12.965067295 +0000 UTC m=+284.964641187" observedRunningTime="2026-04-22 18:51:13.392615347 +0000 UTC m=+285.392189257" watchObservedRunningTime="2026-04-22 18:51:13.393775669 +0000 UTC m=+285.393349568" Apr 22 18:51:14.380158 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:14.380122 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9wfwv" event={"ID":"85daa685-3b8e-4641-b717-08df86db79f9","Type":"ContainerStarted","Data":"f48b86c7be9cd18210352bd623270256d6e2a871c8fb6a61b4d09ef66fb66b85"} Apr 22 18:51:14.381655 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:14.381633 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-22c27" event={"ID":"efb38099-2266-40a5-ba8f-a7759b82543b","Type":"ContainerStarted","Data":"7e970f13fe91fb07ec115501796b1a82f729f7b7364385d76efb6a6e77ca8340"} Apr 22 18:51:14.381655 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:14.381658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-22c27" event={"ID":"efb38099-2266-40a5-ba8f-a7759b82543b","Type":"ContainerStarted","Data":"e7234728fc6cd208ccf656f576a9472890f8c95a74b4611dc818bcbe5bdf3cea"} Apr 22 18:51:14.399192 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:14.399146 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9wfwv" podStartSLOduration=251.33009766200001 podStartE2EDuration="4m13.399133704s" podCreationTimestamp="2026-04-22 18:47:01 +0000 UTC" firstStartedPulling="2026-04-22 18:51:11.80518571 +0000 UTC m=+283.804759589" lastFinishedPulling="2026-04-22 18:51:13.874221754 +0000 UTC m=+285.873795631" observedRunningTime="2026-04-22 18:51:14.397417588 +0000 UTC m=+286.396991508" watchObservedRunningTime="2026-04-22 18:51:14.399133704 +0000 UTC m=+286.398707600" Apr 22 18:51:14.425235 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:14.425179 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-22c27" podStartSLOduration=251.516058985 podStartE2EDuration="4m13.425164336s" podCreationTimestamp="2026-04-22 18:47:01 +0000 UTC" firstStartedPulling="2026-04-22 18:51:11.966496747 +0000 UTC m=+283.966070622" lastFinishedPulling="2026-04-22 18:51:13.875602094 +0000 UTC m=+285.875175973" observedRunningTime="2026-04-22 18:51:14.424486538 +0000 UTC m=+286.424060437" watchObservedRunningTime="2026-04-22 18:51:14.425164336 +0000 UTC m=+286.424738300" Apr 22 18:51:15.384343 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:15.384304 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-22c27" Apr 22 18:51:25.388964 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:51:25.388936 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-22c27" Apr 22 18:53:28.802701 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.802671 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-2jpp7"] Apr 22 18:53:28.804457 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.804441 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:28.807077 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.807056 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-xxtwn\"" Apr 22 18:53:28.807198 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.807059 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 22 18:53:28.807198 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.807095 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 18:53:28.808449 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.808434 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 18:53:28.814115 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.814092 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2jpp7"] Apr 22 18:53:28.927564 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.927534 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/acd885e0-7d33-41d6-adaf-66fb859e10b2-data\") pod \"seaweedfs-86cc847c5c-2jpp7\" (UID: \"acd885e0-7d33-41d6-adaf-66fb859e10b2\") " pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:28.927725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:28.927587 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m8vq\" (UniqueName: \"kubernetes.io/projected/acd885e0-7d33-41d6-adaf-66fb859e10b2-kube-api-access-5m8vq\") pod \"seaweedfs-86cc847c5c-2jpp7\" (UID: \"acd885e0-7d33-41d6-adaf-66fb859e10b2\") " pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:29.028728 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.028690 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/acd885e0-7d33-41d6-adaf-66fb859e10b2-data\") pod \"seaweedfs-86cc847c5c-2jpp7\" (UID: \"acd885e0-7d33-41d6-adaf-66fb859e10b2\") " pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:29.028896 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.028738 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m8vq\" (UniqueName: \"kubernetes.io/projected/acd885e0-7d33-41d6-adaf-66fb859e10b2-kube-api-access-5m8vq\") pod \"seaweedfs-86cc847c5c-2jpp7\" (UID: \"acd885e0-7d33-41d6-adaf-66fb859e10b2\") " pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:29.029055 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.029036 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/acd885e0-7d33-41d6-adaf-66fb859e10b2-data\") pod \"seaweedfs-86cc847c5c-2jpp7\" (UID: \"acd885e0-7d33-41d6-adaf-66fb859e10b2\") " pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:29.037623 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.037601 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m8vq\" (UniqueName: \"kubernetes.io/projected/acd885e0-7d33-41d6-adaf-66fb859e10b2-kube-api-access-5m8vq\") pod \"seaweedfs-86cc847c5c-2jpp7\" (UID: \"acd885e0-7d33-41d6-adaf-66fb859e10b2\") " pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:29.113078 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.112994 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:29.226519 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.226487 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-2jpp7"] Apr 22 18:53:29.228983 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:53:29.228957 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacd885e0_7d33_41d6_adaf_66fb859e10b2.slice/crio-12642d15da3160f0ee6a7e3e1307a1c463173227f121163aaf53abe0927e2da9 WatchSource:0}: Error finding container 12642d15da3160f0ee6a7e3e1307a1c463173227f121163aaf53abe0927e2da9: Status 404 returned error can't find the container with id 12642d15da3160f0ee6a7e3e1307a1c463173227f121163aaf53abe0927e2da9 Apr 22 18:53:29.230102 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.230087 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:53:29.727704 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:29.727666 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2jpp7" event={"ID":"acd885e0-7d33-41d6-adaf-66fb859e10b2","Type":"ContainerStarted","Data":"12642d15da3160f0ee6a7e3e1307a1c463173227f121163aaf53abe0927e2da9"} Apr 22 18:53:31.733929 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:31.733890 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-2jpp7" event={"ID":"acd885e0-7d33-41d6-adaf-66fb859e10b2","Type":"ContainerStarted","Data":"05de2e009ceeebf8d0a093bd9eaf88a2042f087d35d8a7b989894f7ab64e7da4"} Apr 22 18:53:31.734421 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:31.734118 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:53:31.751042 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:31.750978 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/seaweedfs-86cc847c5c-2jpp7" podStartSLOduration=1.39663147 podStartE2EDuration="3.750960281s" podCreationTimestamp="2026-04-22 18:53:28 +0000 UTC" firstStartedPulling="2026-04-22 18:53:29.230209753 +0000 UTC m=+421.229783628" lastFinishedPulling="2026-04-22 18:53:31.584538549 +0000 UTC m=+423.584112439" observedRunningTime="2026-04-22 18:53:31.749819674 +0000 UTC m=+423.749393629" watchObservedRunningTime="2026-04-22 18:53:31.750960281 +0000 UTC m=+423.750534180" Apr 22 18:53:37.739215 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:53:37.739175 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-2jpp7" Apr 22 18:54:39.611140 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.611103 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-hlcqf"] Apr 22 18:54:39.612941 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.612926 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:39.615922 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.615903 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 22 18:54:39.616017 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.615903 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-lj2nm\"" Apr 22 18:54:39.618029 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.618008 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-gcm2s"] Apr 22 18:54:39.620032 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.620013 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:39.622620 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.622606 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-79v2p\"" Apr 22 18:54:39.622891 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.622872 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 22 18:54:39.624801 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.624780 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hlcqf"] Apr 22 18:54:39.628968 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.628949 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gcm2s"] Apr 22 18:54:39.703240 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.703204 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b913ff7b-2b42-4862-b0ed-77e64ba21f2d-cert\") pod \"odh-model-controller-696fc77849-gcm2s\" (UID: \"b913ff7b-2b42-4862-b0ed-77e64ba21f2d\") " pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:39.703240 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.703242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/707021df-9a48-4588-a53e-c8ed64b47ad6-tls-certs\") pod \"model-serving-api-86f7b4b499-hlcqf\" (UID: \"707021df-9a48-4588-a53e-c8ed64b47ad6\") " pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:39.703459 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.703335 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwq7k\" (UniqueName: 
\"kubernetes.io/projected/707021df-9a48-4588-a53e-c8ed64b47ad6-kube-api-access-xwq7k\") pod \"model-serving-api-86f7b4b499-hlcqf\" (UID: \"707021df-9a48-4588-a53e-c8ed64b47ad6\") " pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:39.703459 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.703378 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxm95\" (UniqueName: \"kubernetes.io/projected/b913ff7b-2b42-4862-b0ed-77e64ba21f2d-kube-api-access-bxm95\") pod \"odh-model-controller-696fc77849-gcm2s\" (UID: \"b913ff7b-2b42-4862-b0ed-77e64ba21f2d\") " pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:39.804655 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.804616 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxm95\" (UniqueName: \"kubernetes.io/projected/b913ff7b-2b42-4862-b0ed-77e64ba21f2d-kube-api-access-bxm95\") pod \"odh-model-controller-696fc77849-gcm2s\" (UID: \"b913ff7b-2b42-4862-b0ed-77e64ba21f2d\") " pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:39.804655 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.804657 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b913ff7b-2b42-4862-b0ed-77e64ba21f2d-cert\") pod \"odh-model-controller-696fc77849-gcm2s\" (UID: \"b913ff7b-2b42-4862-b0ed-77e64ba21f2d\") " pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:39.804853 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.804677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/707021df-9a48-4588-a53e-c8ed64b47ad6-tls-certs\") pod \"model-serving-api-86f7b4b499-hlcqf\" (UID: \"707021df-9a48-4588-a53e-c8ed64b47ad6\") " pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:39.804853 ip-10-0-137-19 kubenswrapper[2579]: E0422 
18:54:39.804765 2579 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 22 18:54:39.804853 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:54:39.804821 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/707021df-9a48-4588-a53e-c8ed64b47ad6-tls-certs podName:707021df-9a48-4588-a53e-c8ed64b47ad6 nodeName:}" failed. No retries permitted until 2026-04-22 18:54:40.304805045 +0000 UTC m=+492.304378919 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/707021df-9a48-4588-a53e-c8ed64b47ad6-tls-certs") pod "model-serving-api-86f7b4b499-hlcqf" (UID: "707021df-9a48-4588-a53e-c8ed64b47ad6") : secret "model-serving-api-tls" not found Apr 22 18:54:39.804968 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.804886 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwq7k\" (UniqueName: \"kubernetes.io/projected/707021df-9a48-4588-a53e-c8ed64b47ad6-kube-api-access-xwq7k\") pod \"model-serving-api-86f7b4b499-hlcqf\" (UID: \"707021df-9a48-4588-a53e-c8ed64b47ad6\") " pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:39.807018 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.806996 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b913ff7b-2b42-4862-b0ed-77e64ba21f2d-cert\") pod \"odh-model-controller-696fc77849-gcm2s\" (UID: \"b913ff7b-2b42-4862-b0ed-77e64ba21f2d\") " pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:39.813934 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.813912 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxm95\" (UniqueName: \"kubernetes.io/projected/b913ff7b-2b42-4862-b0ed-77e64ba21f2d-kube-api-access-bxm95\") pod \"odh-model-controller-696fc77849-gcm2s\" (UID: \"b913ff7b-2b42-4862-b0ed-77e64ba21f2d\") " 
pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:39.816577 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.816556 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwq7k\" (UniqueName: \"kubernetes.io/projected/707021df-9a48-4588-a53e-c8ed64b47ad6-kube-api-access-xwq7k\") pod \"model-serving-api-86f7b4b499-hlcqf\" (UID: \"707021df-9a48-4588-a53e-c8ed64b47ad6\") " pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:39.932171 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:39.932150 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:40.050180 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:40.050146 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-gcm2s"] Apr 22 18:54:40.053381 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:54:40.053354 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb913ff7b_2b42_4862_b0ed_77e64ba21f2d.slice/crio-95c392c172df081b41d8c18ee0527e49a3e6c8d022fa76ad16b9cdeb655c377c WatchSource:0}: Error finding container 95c392c172df081b41d8c18ee0527e49a3e6c8d022fa76ad16b9cdeb655c377c: Status 404 returned error can't find the container with id 95c392c172df081b41d8c18ee0527e49a3e6c8d022fa76ad16b9cdeb655c377c Apr 22 18:54:40.309037 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:40.308955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/707021df-9a48-4588-a53e-c8ed64b47ad6-tls-certs\") pod \"model-serving-api-86f7b4b499-hlcqf\" (UID: \"707021df-9a48-4588-a53e-c8ed64b47ad6\") " pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:40.311632 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:40.311603 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/707021df-9a48-4588-a53e-c8ed64b47ad6-tls-certs\") pod \"model-serving-api-86f7b4b499-hlcqf\" (UID: \"707021df-9a48-4588-a53e-c8ed64b47ad6\") " pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:40.522831 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:40.522794 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:40.668332 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:40.668298 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-hlcqf"] Apr 22 18:54:40.674608 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:54:40.674563 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod707021df_9a48_4588_a53e_c8ed64b47ad6.slice/crio-a5e99bba1a145ba999013eaa6fb1b6446650f436853e054e7d457bd724be60ea WatchSource:0}: Error finding container a5e99bba1a145ba999013eaa6fb1b6446650f436853e054e7d457bd724be60ea: Status 404 returned error can't find the container with id a5e99bba1a145ba999013eaa6fb1b6446650f436853e054e7d457bd724be60ea Apr 22 18:54:40.929761 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:40.929658 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-hlcqf" event={"ID":"707021df-9a48-4588-a53e-c8ed64b47ad6","Type":"ContainerStarted","Data":"a5e99bba1a145ba999013eaa6fb1b6446650f436853e054e7d457bd724be60ea"} Apr 22 18:54:40.930988 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:40.930934 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gcm2s" event={"ID":"b913ff7b-2b42-4862-b0ed-77e64ba21f2d","Type":"ContainerStarted","Data":"95c392c172df081b41d8c18ee0527e49a3e6c8d022fa76ad16b9cdeb655c377c"} Apr 22 18:54:43.941711 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:43.941672 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/model-serving-api-86f7b4b499-hlcqf" event={"ID":"707021df-9a48-4588-a53e-c8ed64b47ad6","Type":"ContainerStarted","Data":"84a6d75f7279de6ba01033f53de4592fd7cf5701feb3b1165f089b79ea3e4c00"} Apr 22 18:54:43.942192 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:43.942009 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:43.943063 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:43.943025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-gcm2s" event={"ID":"b913ff7b-2b42-4862-b0ed-77e64ba21f2d","Type":"ContainerStarted","Data":"026593a70956e74fa68165094cb5caff4c63f9f56c8e25cbb16f8a66c5993bff"} Apr 22 18:54:43.943173 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:43.943159 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:43.960877 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:43.960820 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-hlcqf" podStartSLOduration=2.105766302 podStartE2EDuration="4.960803924s" podCreationTimestamp="2026-04-22 18:54:39 +0000 UTC" firstStartedPulling="2026-04-22 18:54:40.678612756 +0000 UTC m=+492.678186636" lastFinishedPulling="2026-04-22 18:54:43.533650383 +0000 UTC m=+495.533224258" observedRunningTime="2026-04-22 18:54:43.95978213 +0000 UTC m=+495.959356051" watchObservedRunningTime="2026-04-22 18:54:43.960803924 +0000 UTC m=+495.960377822" Apr 22 18:54:43.977958 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:43.977913 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-gcm2s" podStartSLOduration=1.5459132850000001 podStartE2EDuration="4.977900632s" podCreationTimestamp="2026-04-22 18:54:39 +0000 UTC" firstStartedPulling="2026-04-22 18:54:40.055006098 +0000 UTC 
m=+492.054579976" lastFinishedPulling="2026-04-22 18:54:43.486993445 +0000 UTC m=+495.486567323" observedRunningTime="2026-04-22 18:54:43.977103483 +0000 UTC m=+495.976677380" watchObservedRunningTime="2026-04-22 18:54:43.977900632 +0000 UTC m=+495.977474528" Apr 22 18:54:54.948230 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:54.948200 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-gcm2s" Apr 22 18:54:54.949943 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:54.949925 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-hlcqf" Apr 22 18:54:55.725994 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:55.725961 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-q8c7l"] Apr 22 18:54:55.728975 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:55.728956 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-q8c7l" Apr 22 18:54:55.735486 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:55.735317 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-q8c7l"] Apr 22 18:54:55.826631 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:55.826601 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhqvw\" (UniqueName: \"kubernetes.io/projected/a97eb4f8-b9a7-4a20-9294-cfd77103ca1e-kube-api-access-dhqvw\") pod \"s3-init-q8c7l\" (UID: \"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e\") " pod="kserve/s3-init-q8c7l" Apr 22 18:54:55.927626 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:55.927595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhqvw\" (UniqueName: \"kubernetes.io/projected/a97eb4f8-b9a7-4a20-9294-cfd77103ca1e-kube-api-access-dhqvw\") pod \"s3-init-q8c7l\" (UID: \"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e\") " pod="kserve/s3-init-q8c7l" Apr 22 18:54:55.937356 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:55.937331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhqvw\" (UniqueName: \"kubernetes.io/projected/a97eb4f8-b9a7-4a20-9294-cfd77103ca1e-kube-api-access-dhqvw\") pod \"s3-init-q8c7l\" (UID: \"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e\") " pod="kserve/s3-init-q8c7l" Apr 22 18:54:56.037997 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:56.037920 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-q8c7l" Apr 22 18:54:56.152009 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:56.151976 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-q8c7l"] Apr 22 18:54:56.155601 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:54:56.155575 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda97eb4f8_b9a7_4a20_9294_cfd77103ca1e.slice/crio-b185fe00e8ce3ce7bb8bf177824a4f862abc81e8719ca6b1fbae91438398f3ac WatchSource:0}: Error finding container b185fe00e8ce3ce7bb8bf177824a4f862abc81e8719ca6b1fbae91438398f3ac: Status 404 returned error can't find the container with id b185fe00e8ce3ce7bb8bf177824a4f862abc81e8719ca6b1fbae91438398f3ac Apr 22 18:54:56.977857 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:54:56.977814 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q8c7l" event={"ID":"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e","Type":"ContainerStarted","Data":"b185fe00e8ce3ce7bb8bf177824a4f862abc81e8719ca6b1fbae91438398f3ac"} Apr 22 18:55:00.990250 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:00.990208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q8c7l" event={"ID":"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e","Type":"ContainerStarted","Data":"e7b5efd483cdd9a57ccdc9a3941eaba4585569c5e6c76e0a696714956978e1dd"} Apr 22 18:55:01.009652 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:01.009593 2579 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-q8c7l" podStartSLOduration=1.6827471379999999 podStartE2EDuration="6.009577828s" podCreationTimestamp="2026-04-22 18:54:55 +0000 UTC" firstStartedPulling="2026-04-22 18:54:56.157318479 +0000 UTC m=+508.156892354" lastFinishedPulling="2026-04-22 18:55:00.484149169 +0000 UTC m=+512.483723044" observedRunningTime="2026-04-22 18:55:01.008189473 +0000 UTC m=+513.007763367" watchObservedRunningTime="2026-04-22 18:55:01.009577828 +0000 UTC m=+513.009151725" Apr 22 18:55:04.000210 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:04.000173 2579 generic.go:358] "Generic (PLEG): container finished" podID="a97eb4f8-b9a7-4a20-9294-cfd77103ca1e" containerID="e7b5efd483cdd9a57ccdc9a3941eaba4585569c5e6c76e0a696714956978e1dd" exitCode=0 Apr 22 18:55:04.000688 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:04.000253 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q8c7l" event={"ID":"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e","Type":"ContainerDied","Data":"e7b5efd483cdd9a57ccdc9a3941eaba4585569c5e6c76e0a696714956978e1dd"} Apr 22 18:55:05.122555 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:05.122535 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-q8c7l" Apr 22 18:55:05.202501 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:05.202466 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhqvw\" (UniqueName: \"kubernetes.io/projected/a97eb4f8-b9a7-4a20-9294-cfd77103ca1e-kube-api-access-dhqvw\") pod \"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e\" (UID: \"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e\") " Apr 22 18:55:05.204561 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:05.204538 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a97eb4f8-b9a7-4a20-9294-cfd77103ca1e-kube-api-access-dhqvw" (OuterVolumeSpecName: "kube-api-access-dhqvw") pod "a97eb4f8-b9a7-4a20-9294-cfd77103ca1e" (UID: "a97eb4f8-b9a7-4a20-9294-cfd77103ca1e"). InnerVolumeSpecName "kube-api-access-dhqvw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:55:05.303926 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:05.303846 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dhqvw\" (UniqueName: \"kubernetes.io/projected/a97eb4f8-b9a7-4a20-9294-cfd77103ca1e-kube-api-access-dhqvw\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:55:06.006560 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.006534 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-q8c7l" Apr 22 18:55:06.006729 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.006536 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q8c7l" event={"ID":"a97eb4f8-b9a7-4a20-9294-cfd77103ca1e","Type":"ContainerDied","Data":"b185fe00e8ce3ce7bb8bf177824a4f862abc81e8719ca6b1fbae91438398f3ac"} Apr 22 18:55:06.006729 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.006636 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b185fe00e8ce3ce7bb8bf177824a4f862abc81e8719ca6b1fbae91438398f3ac" Apr 22 18:55:06.715652 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.715619 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6"] Apr 22 18:55:06.716004 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.715861 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a97eb4f8-b9a7-4a20-9294-cfd77103ca1e" containerName="s3-init" Apr 22 18:55:06.716004 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.715874 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a97eb4f8-b9a7-4a20-9294-cfd77103ca1e" containerName="s3-init" Apr 22 18:55:06.716004 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.715930 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a97eb4f8-b9a7-4a20-9294-cfd77103ca1e" containerName="s3-init" Apr 22 18:55:06.718861 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.718834 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:06.721551 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.721528 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 22 18:55:06.724466 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.724432 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6"] Apr 22 18:55:06.814379 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.814340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kf4qj\" (UniqueName: \"kubernetes.io/projected/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-kube-api-access-kf4qj\") pod \"seaweedfs-tls-custom-ddd4dbfd-5pjk6\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:06.814534 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.814395 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-5pjk6\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:06.915500 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.915466 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kf4qj\" (UniqueName: \"kubernetes.io/projected/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-kube-api-access-kf4qj\") pod \"seaweedfs-tls-custom-ddd4dbfd-5pjk6\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:06.915662 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.915519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: 
\"kubernetes.io/empty-dir/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-5pjk6\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:06.915853 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.915838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-5pjk6\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:06.925150 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:06.925129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kf4qj\" (UniqueName: \"kubernetes.io/projected/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-kube-api-access-kf4qj\") pod \"seaweedfs-tls-custom-ddd4dbfd-5pjk6\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:07.028418 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:07.028331 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:07.140567 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:07.140538 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6"] Apr 22 18:55:07.142943 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:55:07.142915 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0adcdf_80f0_4fc7_8a2c_762efe50f909.slice/crio-d06dd0d11095402037b054bbc4b52aa5623c61f0d5127c2cc3f89cb82f366b5a WatchSource:0}: Error finding container d06dd0d11095402037b054bbc4b52aa5623c61f0d5127c2cc3f89cb82f366b5a: Status 404 returned error can't find the container with id d06dd0d11095402037b054bbc4b52aa5623c61f0d5127c2cc3f89cb82f366b5a Apr 22 18:55:08.020334 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:08.020300 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" event={"ID":"9f0adcdf-80f0-4fc7-8a2c-762efe50f909","Type":"ContainerStarted","Data":"74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43"} Apr 22 18:55:08.020334 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:08.020336 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" event={"ID":"9f0adcdf-80f0-4fc7-8a2c-762efe50f909","Type":"ContainerStarted","Data":"d06dd0d11095402037b054bbc4b52aa5623c61f0d5127c2cc3f89cb82f366b5a"} Apr 22 18:55:08.037779 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:08.037735 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" podStartSLOduration=1.777219615 podStartE2EDuration="2.037722644s" podCreationTimestamp="2026-04-22 18:55:06 +0000 UTC" firstStartedPulling="2026-04-22 18:55:07.144182877 +0000 UTC m=+519.143756754" lastFinishedPulling="2026-04-22 18:55:07.404685908 +0000 UTC m=+519.404259783" 
observedRunningTime="2026-04-22 18:55:08.036758727 +0000 UTC m=+520.036332624" watchObservedRunningTime="2026-04-22 18:55:08.037722644 +0000 UTC m=+520.037296538" Apr 22 18:55:08.752686 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:08.752653 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6"] Apr 22 18:55:10.025302 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:10.025216 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" podUID="9f0adcdf-80f0-4fc7-8a2c-762efe50f909" containerName="seaweedfs-tls-custom" containerID="cri-o://74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43" gracePeriod=30 Apr 22 18:55:37.947088 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:55:37.947054 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f0adcdf_80f0_4fc7_8a2c_762efe50f909.slice/crio-conmon-74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43.scope\": RecentStats: unable to find data in memory cache]" Apr 22 18:55:38.059858 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.059831 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:38.094139 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.094107 2579 generic.go:358] "Generic (PLEG): container finished" podID="9f0adcdf-80f0-4fc7-8a2c-762efe50f909" containerID="74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43" exitCode=0 Apr 22 18:55:38.094327 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.094172 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" Apr 22 18:55:38.094327 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.094179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" event={"ID":"9f0adcdf-80f0-4fc7-8a2c-762efe50f909","Type":"ContainerDied","Data":"74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43"} Apr 22 18:55:38.094327 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.094206 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6" event={"ID":"9f0adcdf-80f0-4fc7-8a2c-762efe50f909","Type":"ContainerDied","Data":"d06dd0d11095402037b054bbc4b52aa5623c61f0d5127c2cc3f89cb82f366b5a"} Apr 22 18:55:38.094327 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.094223 2579 scope.go:117] "RemoveContainer" containerID="74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43" Apr 22 18:55:38.103245 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.103229 2579 scope.go:117] "RemoveContainer" containerID="74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43" Apr 22 18:55:38.103546 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:55:38.103518 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43\": container with ID starting with 74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43 not found: ID does not exist" containerID="74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43" Apr 22 18:55:38.103614 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.103560 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43"} err="failed to get container status \"74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43\": rpc error: code = 
NotFound desc = could not find container \"74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43\": container with ID starting with 74d22943e4f3dfc90a9d45f0c765eb790cf37b3e42243f87302e65fb34366b43 not found: ID does not exist" Apr 22 18:55:38.144870 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.144846 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kf4qj\" (UniqueName: \"kubernetes.io/projected/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-kube-api-access-kf4qj\") pod \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " Apr 22 18:55:38.144969 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.144900 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-data\") pod \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\" (UID: \"9f0adcdf-80f0-4fc7-8a2c-762efe50f909\") " Apr 22 18:55:38.146119 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.146096 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-data" (OuterVolumeSpecName: "data") pod "9f0adcdf-80f0-4fc7-8a2c-762efe50f909" (UID: "9f0adcdf-80f0-4fc7-8a2c-762efe50f909"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:55:38.146735 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.146711 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-kube-api-access-kf4qj" (OuterVolumeSpecName: "kube-api-access-kf4qj") pod "9f0adcdf-80f0-4fc7-8a2c-762efe50f909" (UID: "9f0adcdf-80f0-4fc7-8a2c-762efe50f909"). InnerVolumeSpecName "kube-api-access-kf4qj". 
PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:55:38.245836 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.245807 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kf4qj\" (UniqueName: \"kubernetes.io/projected/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-kube-api-access-kf4qj\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 18:55:38.245836 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.245831 2579 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9f0adcdf-80f0-4fc7-8a2c-762efe50f909-data\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 18:55:38.418364 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.418330 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6"]
Apr 22 18:55:38.422394 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.422372 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-5pjk6"]
Apr 22 18:55:38.449586 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.449565 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"]
Apr 22 18:55:38.449810 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.449798 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9f0adcdf-80f0-4fc7-8a2c-762efe50f909" containerName="seaweedfs-tls-custom"
Apr 22 18:55:38.449849 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.449812 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f0adcdf-80f0-4fc7-8a2c-762efe50f909" containerName="seaweedfs-tls-custom"
Apr 22 18:55:38.449882 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.449863 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9f0adcdf-80f0-4fc7-8a2c-762efe50f909" containerName="seaweedfs-tls-custom"
Apr 22 18:55:38.453097 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.453082 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.455807 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.455776 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 22 18:55:38.455897 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.455819 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 22 18:55:38.459507 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.459485 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"]
Apr 22 18:55:38.546389 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.546319 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f0adcdf-80f0-4fc7-8a2c-762efe50f909" path="/var/lib/kubelet/pods/9f0adcdf-80f0-4fc7-8a2c-762efe50f909/volumes"
Apr 22 18:55:38.548118 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.548101 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2a9e4317-9c35-43f1-9440-fdd65d6c362d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.548173 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.548134 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/2a9e4317-9c35-43f1-9440-fdd65d6c362d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.548173 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.548154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sghwz\" (UniqueName: \"kubernetes.io/projected/2a9e4317-9c35-43f1-9440-fdd65d6c362d-kube-api-access-sghwz\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.649402 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.649369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2a9e4317-9c35-43f1-9440-fdd65d6c362d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.649577 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.649422 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/2a9e4317-9c35-43f1-9440-fdd65d6c362d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.649577 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.649455 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sghwz\" (UniqueName: \"kubernetes.io/projected/2a9e4317-9c35-43f1-9440-fdd65d6c362d-kube-api-access-sghwz\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.649800 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.649776 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2a9e4317-9c35-43f1-9440-fdd65d6c362d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.651795 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.651777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/2a9e4317-9c35-43f1-9440-fdd65d6c362d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.658382 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.658363 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sghwz\" (UniqueName: \"kubernetes.io/projected/2a9e4317-9c35-43f1-9440-fdd65d6c362d-kube-api-access-sghwz\") pod \"seaweedfs-tls-custom-5c88b85bb7-mlw7c\" (UID: \"2a9e4317-9c35-43f1-9440-fdd65d6c362d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.762918 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.762887 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"
Apr 22 18:55:38.875559 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:38.875529 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c"]
Apr 22 18:55:38.877730 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:55:38.877701 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9e4317_9c35_43f1_9440_fdd65d6c362d.slice/crio-bc9e80554f624ca299b02d18feb6a43a2211e0bb71e0ebe03eace4d4af91833e WatchSource:0}: Error finding container bc9e80554f624ca299b02d18feb6a43a2211e0bb71e0ebe03eace4d4af91833e: Status 404 returned error can't find the container with id bc9e80554f624ca299b02d18feb6a43a2211e0bb71e0ebe03eace4d4af91833e
Apr 22 18:55:39.098212 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:39.098119 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c" event={"ID":"2a9e4317-9c35-43f1-9440-fdd65d6c362d","Type":"ContainerStarted","Data":"bc9e80554f624ca299b02d18feb6a43a2211e0bb71e0ebe03eace4d4af91833e"}
Apr 22 18:55:40.102014 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:40.101982 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c" event={"ID":"2a9e4317-9c35-43f1-9440-fdd65d6c362d","Type":"ContainerStarted","Data":"6bbf416c975750611b3afc04f9595a59bb25e3a3aee811f0830c7f07abf6cef0"}
Apr 22 18:55:40.118628 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:40.118585 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-mlw7c" podStartSLOduration=1.879440106 podStartE2EDuration="2.118573786s" podCreationTimestamp="2026-04-22 18:55:38 +0000 UTC" firstStartedPulling="2026-04-22 18:55:38.878891825 +0000 UTC m=+550.878465699" lastFinishedPulling="2026-04-22 18:55:39.1180255 +0000 UTC m=+551.117599379" observedRunningTime="2026-04-22 18:55:40.117082015 +0000 UTC m=+552.116655933" watchObservedRunningTime="2026-04-22 18:55:40.118573786 +0000 UTC m=+552.118147682"
Apr 22 18:55:54.234101 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.234064 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"]
Apr 22 18:55:54.236856 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.236839 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.239413 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.239396 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 22 18:55:54.239506 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.239399 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\""
Apr 22 18:55:54.244255 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.244229 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"]
Apr 22 18:55:54.361161 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.361132 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/86e3e93e-024b-4c69-a5bb-0091df5e9a51-data\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.361366 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.361186 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dssv\" (UniqueName: \"kubernetes.io/projected/86e3e93e-024b-4c69-a5bb-0091df5e9a51-kube-api-access-5dssv\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.361366 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.361286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/86e3e93e-024b-4c69-a5bb-0091df5e9a51-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.462115 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.462081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dssv\" (UniqueName: \"kubernetes.io/projected/86e3e93e-024b-4c69-a5bb-0091df5e9a51-kube-api-access-5dssv\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.462115 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.462118 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/86e3e93e-024b-4c69-a5bb-0091df5e9a51-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.462376 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.462143 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/86e3e93e-024b-4c69-a5bb-0091df5e9a51-data\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.462497 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.462473 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/86e3e93e-024b-4c69-a5bb-0091df5e9a51-data\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.464552 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.464528 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/86e3e93e-024b-4c69-a5bb-0091df5e9a51-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.471319 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.471301 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dssv\" (UniqueName: \"kubernetes.io/projected/86e3e93e-024b-4c69-a5bb-0091df5e9a51-kube-api-access-5dssv\") pod \"seaweedfs-tls-serving-7fd5766db9-w8qpf\" (UID: \"86e3e93e-024b-4c69-a5bb-0091df5e9a51\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.545743 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.545675 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"
Apr 22 18:55:54.658620 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:54.658594 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf"]
Apr 22 18:55:54.661448 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:55:54.661424 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86e3e93e_024b_4c69_a5bb_0091df5e9a51.slice/crio-fb41056070e4adb64dd8e2c2ced1a40e0b3f6a1424723a361a9c7aa9b009e26d WatchSource:0}: Error finding container fb41056070e4adb64dd8e2c2ced1a40e0b3f6a1424723a361a9c7aa9b009e26d: Status 404 returned error can't find the container with id fb41056070e4adb64dd8e2c2ced1a40e0b3f6a1424723a361a9c7aa9b009e26d
Apr 22 18:55:55.142128 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:55.142088 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf" event={"ID":"86e3e93e-024b-4c69-a5bb-0091df5e9a51","Type":"ContainerStarted","Data":"bd14304e679a11574d0c73af3780e64fde4d626becf54362425e3a5c37da9567"}
Apr 22 18:55:55.142128 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:55.142128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf" event={"ID":"86e3e93e-024b-4c69-a5bb-0091df5e9a51","Type":"ContainerStarted","Data":"fb41056070e4adb64dd8e2c2ced1a40e0b3f6a1424723a361a9c7aa9b009e26d"}
Apr 22 18:55:55.158850 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:55:55.158764 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w8qpf" podStartSLOduration=0.92228947 podStartE2EDuration="1.158751679s" podCreationTimestamp="2026-04-22 18:55:54 +0000 UTC" firstStartedPulling="2026-04-22 18:55:54.662819699 +0000 UTC m=+566.662393576" lastFinishedPulling="2026-04-22 18:55:54.899281902 +0000 UTC m=+566.898855785" observedRunningTime="2026-04-22 18:55:55.158204686 +0000 UTC m=+567.157778582" watchObservedRunningTime="2026-04-22 18:55:55.158751679 +0000 UTC m=+567.158325574"
Apr 22 18:56:12.270451 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.270416 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"]
Apr 22 18:56:12.275070 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.275053 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.277851 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.277830 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 18:56:12.277953 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.277831 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 18:56:12.279128 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.279110 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bz9sf\""
Apr 22 18:56:12.279128 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.279123 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-predictor-serving-cert\""
Apr 22 18:56:12.279311 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.279140 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\""
Apr 22 18:56:12.284943 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.284918 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"]
Apr 22 18:56:12.398405 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.398367 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b511405-63da-4bc9-860f-083b25a83eb8-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.398591 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.398417 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b511405-63da-4bc9-860f-083b25a83eb8-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.398591 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.398441 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkvcs\" (UniqueName: \"kubernetes.io/projected/0b511405-63da-4bc9-860f-083b25a83eb8-kube-api-access-jkvcs\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.398591 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.398466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b511405-63da-4bc9-860f-083b25a83eb8-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.499312 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.499247 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b511405-63da-4bc9-860f-083b25a83eb8-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.499498 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.499395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkvcs\" (UniqueName: \"kubernetes.io/projected/0b511405-63da-4bc9-860f-083b25a83eb8-kube-api-access-jkvcs\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.499498 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.499436 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b511405-63da-4bc9-860f-083b25a83eb8-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.499617 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.499527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b511405-63da-4bc9-860f-083b25a83eb8-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.499843 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.499818 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b511405-63da-4bc9-860f-083b25a83eb8-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.500032 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.500013 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b511405-63da-4bc9-860f-083b25a83eb8-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.501831 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.501809 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b511405-63da-4bc9-860f-083b25a83eb8-proxy-tls\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.508519 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.508499 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkvcs\" (UniqueName: \"kubernetes.io/projected/0b511405-63da-4bc9-860f-083b25a83eb8-kube-api-access-jkvcs\") pod \"isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.586568 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.586470 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:12.706652 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:12.706574 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"]
Apr 22 18:56:12.709279 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:56:12.709229 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b511405_63da_4bc9_860f_083b25a83eb8.slice/crio-78b2d0a3c9f96b376044c24d83c47a805ad85bf1b5cde5afb1f56ffaba0beac3 WatchSource:0}: Error finding container 78b2d0a3c9f96b376044c24d83c47a805ad85bf1b5cde5afb1f56ffaba0beac3: Status 404 returned error can't find the container with id 78b2d0a3c9f96b376044c24d83c47a805ad85bf1b5cde5afb1f56ffaba0beac3
Apr 22 18:56:13.191878 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:13.191843 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerStarted","Data":"78b2d0a3c9f96b376044c24d83c47a805ad85bf1b5cde5afb1f56ffaba0beac3"}
Apr 22 18:56:17.205918 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:17.205886 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerStarted","Data":"fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a"}
Apr 22 18:56:20.217337 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:20.217300 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b511405-63da-4bc9-860f-083b25a83eb8" containerID="fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a" exitCode=0
Apr 22 18:56:20.217790 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:20.217381 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerDied","Data":"fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a"}
Apr 22 18:56:34.265114 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:34.265072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerStarted","Data":"1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53"}
Apr 22 18:56:36.273085 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:36.273049 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerStarted","Data":"9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140"}
Apr 22 18:56:39.284274 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:39.284234 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerStarted","Data":"e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650"}
Apr 22 18:56:39.284712 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:39.284408 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:39.308817 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:39.308768 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podStartSLOduration=0.884464615 podStartE2EDuration="27.308751778s" podCreationTimestamp="2026-04-22 18:56:12 +0000 UTC" firstStartedPulling="2026-04-22 18:56:12.711069518 +0000 UTC m=+584.710643393" lastFinishedPulling="2026-04-22 18:56:39.135356678 +0000 UTC m=+611.134930556" observedRunningTime="2026-04-22 18:56:39.3062439 +0000 UTC m=+611.305817808" watchObservedRunningTime="2026-04-22 18:56:39.308751778 +0000 UTC m=+611.308325742"
Apr 22 18:56:40.287045 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:40.287010 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:40.287045 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:40.287058 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:40.288648 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:40.288602 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:56:40.289254 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:40.289228 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:56:40.291942 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:40.291922 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:56:41.289683 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:41.289645 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:56:41.290129 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:41.290034 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:56:42.292160 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:42.292124 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:56:42.292592 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:42.292352 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:56:52.292331 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:52.292222 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:56:52.292831 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:56:52.292724 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:57:02.293035 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:02.292995 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:57:02.293542 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:02.293513 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:57:12.292967 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:12.292929 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:57:12.293585 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:12.293428 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:57:22.292767 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:22.292717 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:57:22.293166 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:22.293114 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:57:32.292218 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:32.292170 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused"
Apr 22 18:57:32.292789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:32.292707 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:57:42.293337 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:42.293305 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:57:42.293734 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:42.293418 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"
Apr 22 18:57:57.307783 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.307746 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"]
Apr 22 18:57:57.308240 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.308210 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" containerID="cri-o://1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53" gracePeriod=30
Apr 22 18:57:57.308347 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.308221 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" containerID="cri-o://e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650" gracePeriod=30
Apr 22 18:57:57.308503 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.308448 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" containerID="cri-o://9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140" gracePeriod=30
Apr 22 18:57:57.402491 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.402461 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"]
Apr 22 18:57:57.405390 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.405371 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"
Apr 22 18:57:57.408091 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.408070 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-predictor-serving-cert\""
Apr 22 18:57:57.408210 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.408073 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\""
Apr 22 18:57:57.415819 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.415797 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"]
Apr 22 18:57:57.505835 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.505806 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b511405-63da-4bc9-860f-083b25a83eb8" containerID="9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140" exitCode=2
Apr 22 18:57:57.505997 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.505882 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerDied","Data":"9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140"}
Apr 22 18:57:57.516159 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.516140 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clmz5\" (UniqueName: \"kubernetes.io/projected/97cd9146-5575-42cd-8f0d-6609514a9dc1-kube-api-access-clmz5\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"
Apr 22 18:57:57.516223 ip-10-0-137-19 kubenswrapper[2579]: I0422
18:57:57.516178 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97cd9146-5575-42cd-8f0d-6609514a9dc1-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.516223 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.516198 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97cd9146-5575-42cd-8f0d-6609514a9dc1-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.516326 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.516235 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97cd9146-5575-42cd-8f0d-6609514a9dc1-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.616769 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.616680 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clmz5\" (UniqueName: \"kubernetes.io/projected/97cd9146-5575-42cd-8f0d-6609514a9dc1-kube-api-access-clmz5\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.616769 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:57:57.616725 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97cd9146-5575-42cd-8f0d-6609514a9dc1-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.616769 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.616749 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97cd9146-5575-42cd-8f0d-6609514a9dc1-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.616769 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.616772 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97cd9146-5575-42cd-8f0d-6609514a9dc1-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.617146 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.617126 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97cd9146-5575-42cd-8f0d-6609514a9dc1-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.617440 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:57:57.617423 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97cd9146-5575-42cd-8f0d-6609514a9dc1-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.619152 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.619129 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97cd9146-5575-42cd-8f0d-6609514a9dc1-proxy-tls\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.625490 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.625469 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clmz5\" (UniqueName: \"kubernetes.io/projected/97cd9146-5575-42cd-8f0d-6609514a9dc1-kube-api-access-clmz5\") pod \"isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.717340 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.717292 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:57:57.837915 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:57.837883 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"] Apr 22 18:57:57.841717 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:57:57.841692 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97cd9146_5575_42cd_8f0d_6609514a9dc1.slice/crio-71b11790321e9036712b6cb00acf9d4aaec13403a3316f4e3add8b29f7b0425e WatchSource:0}: Error finding container 71b11790321e9036712b6cb00acf9d4aaec13403a3316f4e3add8b29f7b0425e: Status 404 returned error can't find the container with id 71b11790321e9036712b6cb00acf9d4aaec13403a3316f4e3add8b29f7b0425e Apr 22 18:57:58.509719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:58.509687 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerStarted","Data":"a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477"} Apr 22 18:57:58.509719 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:57:58.509723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerStarted","Data":"71b11790321e9036712b6cb00acf9d4aaec13403a3316f4e3add8b29f7b0425e"} Apr 22 18:58:00.288378 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:00.288334 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: 
connect: connection refused" Apr 22 18:58:01.520732 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:01.520698 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b511405-63da-4bc9-860f-083b25a83eb8" containerID="1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53" exitCode=0 Apr 22 18:58:01.521082 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:01.520751 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerDied","Data":"1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53"} Apr 22 18:58:02.292099 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:02.292054 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 18:58:02.292356 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:02.292322 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:02.525273 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:02.525239 2579 generic.go:358] "Generic (PLEG): container finished" podID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerID="a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477" exitCode=0 Apr 22 18:58:02.525634 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:02.525315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" 
event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerDied","Data":"a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477"} Apr 22 18:58:03.531007 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.530976 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerStarted","Data":"75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655"} Apr 22 18:58:03.531007 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.531014 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerStarted","Data":"51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8"} Apr 22 18:58:03.531531 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.531025 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerStarted","Data":"cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208"} Apr 22 18:58:03.531531 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.531402 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:58:03.531531 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.531445 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:58:03.531531 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.531457 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:58:03.533087 
ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.533052 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:03.533680 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.533650 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:03.552039 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:03.551997 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podStartSLOduration=6.551984696 podStartE2EDuration="6.551984696s" podCreationTimestamp="2026-04-22 18:57:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:58:03.551583499 +0000 UTC m=+695.551157396" watchObservedRunningTime="2026-04-22 18:58:03.551984696 +0000 UTC m=+695.551558593" Apr 22 18:58:04.540041 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:04.540001 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:04.540616 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:04.540310 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" 
containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:05.288258 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:05.288218 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 22 18:58:09.544548 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:09.544520 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:58:09.545147 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:09.545125 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:09.545470 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:09.545449 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:10.288088 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:10.288048 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 22 18:58:10.288255 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:10.288200 2579 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" Apr 22 18:58:12.292887 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:12.292843 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 18:58:12.293355 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:12.293222 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:15.287284 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:15.287229 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 22 18:58:19.545994 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:19.545957 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:19.546472 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:19.546392 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with 
statuscode: 503" Apr 22 18:58:20.287723 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:20.287682 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 10.132.0.21:8643: connect: connection refused" Apr 22 18:58:22.292557 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:22.292511 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.21:8080: connect: connection refused" Apr 22 18:58:22.292978 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:22.292639 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" Apr 22 18:58:22.292978 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:22.292900 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:22.293049 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:22.292990 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" Apr 22 18:58:25.288109 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:25.288064 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.21:8643/healthz\": dial tcp 
10.132.0.21:8643: connect: connection refused" Apr 22 18:58:27.453753 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.453731 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" Apr 22 18:58:27.535442 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.535405 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b511405-63da-4bc9-860f-083b25a83eb8-proxy-tls\") pod \"0b511405-63da-4bc9-860f-083b25a83eb8\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " Apr 22 18:58:27.535615 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.535473 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkvcs\" (UniqueName: \"kubernetes.io/projected/0b511405-63da-4bc9-860f-083b25a83eb8-kube-api-access-jkvcs\") pod \"0b511405-63da-4bc9-860f-083b25a83eb8\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " Apr 22 18:58:27.535615 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.535570 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b511405-63da-4bc9-860f-083b25a83eb8-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") pod \"0b511405-63da-4bc9-860f-083b25a83eb8\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " Apr 22 18:58:27.535615 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.535605 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b511405-63da-4bc9-860f-083b25a83eb8-kserve-provision-location\") pod \"0b511405-63da-4bc9-860f-083b25a83eb8\" (UID: \"0b511405-63da-4bc9-860f-083b25a83eb8\") " Apr 22 18:58:27.535949 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.535915 2579 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/0b511405-63da-4bc9-860f-083b25a83eb8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b511405-63da-4bc9-860f-083b25a83eb8" (UID: "0b511405-63da-4bc9-860f-083b25a83eb8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 18:58:27.535949 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.535924 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b511405-63da-4bc9-860f-083b25a83eb8-isvc-sklearn-batcher-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-kube-rbac-proxy-sar-config") pod "0b511405-63da-4bc9-860f-083b25a83eb8" (UID: "0b511405-63da-4bc9-860f-083b25a83eb8"). InnerVolumeSpecName "isvc-sklearn-batcher-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 18:58:27.537793 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.537769 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b511405-63da-4bc9-860f-083b25a83eb8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b511405-63da-4bc9-860f-083b25a83eb8" (UID: "0b511405-63da-4bc9-860f-083b25a83eb8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 18:58:27.537866 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.537805 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b511405-63da-4bc9-860f-083b25a83eb8-kube-api-access-jkvcs" (OuterVolumeSpecName: "kube-api-access-jkvcs") pod "0b511405-63da-4bc9-860f-083b25a83eb8" (UID: "0b511405-63da-4bc9-860f-083b25a83eb8"). InnerVolumeSpecName "kube-api-access-jkvcs". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 18:58:27.602403 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.602305 2579 generic.go:358] "Generic (PLEG): container finished" podID="0b511405-63da-4bc9-860f-083b25a83eb8" containerID="e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650" exitCode=0 Apr 22 18:58:27.602403 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.602379 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerDied","Data":"e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650"} Apr 22 18:58:27.602623 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.602431 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" event={"ID":"0b511405-63da-4bc9-860f-083b25a83eb8","Type":"ContainerDied","Data":"78b2d0a3c9f96b376044c24d83c47a805ad85bf1b5cde5afb1f56ffaba0beac3"} Apr 22 18:58:27.602623 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.602452 2579 scope.go:117] "RemoveContainer" containerID="e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650" Apr 22 18:58:27.602623 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.602392 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h" Apr 22 18:58:27.610547 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.610529 2579 scope.go:117] "RemoveContainer" containerID="9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140" Apr 22 18:58:27.617518 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.617499 2579 scope.go:117] "RemoveContainer" containerID="1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53" Apr 22 18:58:27.624361 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.624339 2579 scope.go:117] "RemoveContainer" containerID="fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a" Apr 22 18:58:27.626013 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.625989 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"] Apr 22 18:58:27.630216 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.630194 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6c75bdff6f-tgw2h"] Apr 22 18:58:27.631927 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.631912 2579 scope.go:117] "RemoveContainer" containerID="e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650" Apr 22 18:58:27.632156 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:58:27.632140 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650\": container with ID starting with e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650 not found: ID does not exist" containerID="e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650" Apr 22 18:58:27.632199 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.632163 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650"} err="failed to get container status \"e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650\": rpc error: code = NotFound desc = could not find container \"e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650\": container with ID starting with e4db82fa010ffee03f7385c85c10c0cf14fbe9dae340a03344df42e723513650 not found: ID does not exist" Apr 22 18:58:27.632199 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.632179 2579 scope.go:117] "RemoveContainer" containerID="9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140" Apr 22 18:58:27.632406 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:58:27.632389 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140\": container with ID starting with 9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140 not found: ID does not exist" containerID="9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140" Apr 22 18:58:27.632488 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.632407 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140"} err="failed to get container status \"9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140\": rpc error: code = NotFound desc = could not find container \"9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140\": container with ID starting with 9696252bbb65eeeda3f9390ae22ee310dda127343a3087cedeffafb49031b140 not found: ID does not exist" Apr 22 18:58:27.632488 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.632420 2579 scope.go:117] "RemoveContainer" containerID="1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53" Apr 22 18:58:27.632595 ip-10-0-137-19 
kubenswrapper[2579]: E0422 18:58:27.632573 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53\": container with ID starting with 1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53 not found: ID does not exist" containerID="1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53" Apr 22 18:58:27.632648 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.632595 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53"} err="failed to get container status \"1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53\": rpc error: code = NotFound desc = could not find container \"1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53\": container with ID starting with 1966c957d47deb488b0ca28b5d10a693701b6abe22bbeac4a461f8070f81cf53 not found: ID does not exist" Apr 22 18:58:27.632648 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.632615 2579 scope.go:117] "RemoveContainer" containerID="fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a" Apr 22 18:58:27.632846 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:58:27.632828 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a\": container with ID starting with fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a not found: ID does not exist" containerID="fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a" Apr 22 18:58:27.632909 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.632851 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a"} err="failed to 
get container status \"fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a\": rpc error: code = NotFound desc = could not find container \"fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a\": container with ID starting with fe2a4f738ba37588058c86f1365bb26ff2b13e90cc9a6d0a55d518739969161a not found: ID does not exist" Apr 22 18:58:27.636336 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.636317 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b511405-63da-4bc9-860f-083b25a83eb8-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:58:27.636336 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.636334 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jkvcs\" (UniqueName: \"kubernetes.io/projected/0b511405-63da-4bc9-860f-083b25a83eb8-kube-api-access-jkvcs\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:58:27.636483 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.636345 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0b511405-63da-4bc9-860f-083b25a83eb8-isvc-sklearn-batcher-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:58:27.636483 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:27.636360 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b511405-63da-4bc9-860f-083b25a83eb8-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 18:58:28.544013 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:28.543982 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" path="/var/lib/kubelet/pods/0b511405-63da-4bc9-860f-083b25a83eb8/volumes" Apr 22 18:58:29.545556 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:58:29.545516 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:29.545977 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:29.545952 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:39.545389 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:39.545344 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:39.545913 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:39.545861 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:49.545371 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:49.545323 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:49.545839 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:49.545693 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:58:59.545114 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:59.545069 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:58:59.545584 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:58:59.545531 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:09.546247 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:09.546217 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:59:09.546669 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:09.546417 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:59:22.502629 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.502591 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"] Apr 22 18:59:22.503287 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.502961 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" 
containerID="cri-o://cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208" gracePeriod=30 Apr 22 18:59:22.503287 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.502985 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" containerID="cri-o://75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655" gracePeriod=30 Apr 22 18:59:22.503287 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.503089 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy" containerID="cri-o://51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8" gracePeriod=30 Apr 22 18:59:22.545401 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545376 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8"] Apr 22 18:59:22.545641 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545630 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" Apr 22 18:59:22.545696 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545642 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" Apr 22 18:59:22.545696 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545653 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" Apr 22 18:59:22.545696 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545658 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" 
Apr 22 18:59:22.545696 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545691 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" Apr 22 18:59:22.545696 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545697 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" Apr 22 18:59:22.545889 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545704 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="storage-initializer" Apr 22 18:59:22.545889 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545709 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="storage-initializer" Apr 22 18:59:22.545889 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545761 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="agent" Apr 22 18:59:22.545889 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545769 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kserve-container" Apr 22 18:59:22.545889 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.545775 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b511405-63da-4bc9-860f-083b25a83eb8" containerName="kube-rbac-proxy" Apr 22 18:59:22.547542 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.547525 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.550248 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.550228 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-predictor-serving-cert\"" Apr 22 18:59:22.550354 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.550237 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"message-dumper-kube-rbac-proxy-sar-config\"" Apr 22 18:59:22.558424 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.558405 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8"] Apr 22 18:59:22.652423 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.652393 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.652543 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.652430 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.652543 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.652456 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf98s\" (UniqueName: 
\"kubernetes.io/projected/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-kube-api-access-gf98s\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.753232 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.753153 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.753232 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.753186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.753232 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.753213 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gf98s\" (UniqueName: \"kubernetes.io/projected/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-kube-api-access-gf98s\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.753473 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:22.753333 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/message-dumper-predictor-serving-cert: secret "message-dumper-predictor-serving-cert" not found Apr 22 18:59:22.753473 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:22.753412 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls podName:5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9 nodeName:}" failed. No retries permitted until 2026-04-22 18:59:23.253391371 +0000 UTC m=+775.252965251 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls") pod "message-dumper-predictor-c7d86bcbd-6zbs8" (UID: "5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9") : secret "message-dumper-predictor-serving-cert" not found Apr 22 18:59:22.753811 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.753793 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-message-dumper-kube-rbac-proxy-sar-config\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.762522 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.762498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf98s\" (UniqueName: \"kubernetes.io/projected/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-kube-api-access-gf98s\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:22.770043 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.770018 2579 generic.go:358] "Generic (PLEG): container finished" podID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerID="51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8" exitCode=2 Apr 22 18:59:22.770144 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:22.770052 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" 
event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerDied","Data":"51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8"} Apr 22 18:59:23.257743 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:23.257708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:23.260053 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:23.260022 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls\") pod \"message-dumper-predictor-c7d86bcbd-6zbs8\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:23.457821 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:23.457780 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:23.576812 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:23.576574 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8"] Apr 22 18:59:23.579310 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:59:23.579257 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8200e0_1c22_4bc8_bc1c_74b966e8f9e9.slice/crio-abcbc5b38b484a1f8e91d9e8d4976934ca9154a094b298055aaa5a6e83f4cabb WatchSource:0}: Error finding container abcbc5b38b484a1f8e91d9e8d4976934ca9154a094b298055aaa5a6e83f4cabb: Status 404 returned error can't find the container with id abcbc5b38b484a1f8e91d9e8d4976934ca9154a094b298055aaa5a6e83f4cabb Apr 22 18:59:23.581086 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:23.581069 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 18:59:23.773697 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:23.773665 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" event={"ID":"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9","Type":"ContainerStarted","Data":"abcbc5b38b484a1f8e91d9e8d4976934ca9154a094b298055aaa5a6e83f4cabb"} Apr 22 18:59:24.540194 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:24.540164 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 22 18:59:24.777908 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:24.777828 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" event={"ID":"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9","Type":"ContainerStarted","Data":"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e"} Apr 22 18:59:24.777908 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:24.777862 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" event={"ID":"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9","Type":"ContainerStarted","Data":"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f"} Apr 22 18:59:24.778327 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:24.777980 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:24.796866 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:24.796821 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" podStartSLOduration=1.898066993 podStartE2EDuration="2.796804871s" podCreationTimestamp="2026-04-22 18:59:22 +0000 UTC" firstStartedPulling="2026-04-22 18:59:23.581198387 +0000 UTC m=+775.580772262" lastFinishedPulling="2026-04-22 18:59:24.479936256 +0000 UTC m=+776.479510140" observedRunningTime="2026-04-22 18:59:24.794641692 +0000 UTC m=+776.794215593" watchObservedRunningTime="2026-04-22 18:59:24.796804871 +0000 UTC m=+776.796378768" Apr 22 18:59:25.780363 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:25.780334 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:25.782010 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:25.781986 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:26.784745 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:26.784711 2579 
generic.go:358] "Generic (PLEG): container finished" podID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerID="cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208" exitCode=0 Apr 22 18:59:26.785086 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:26.784773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerDied","Data":"cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208"} Apr 22 18:59:29.540653 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:29.540616 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 22 18:59:29.546008 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:29.545979 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:59:29.546216 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:29.546195 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:32.792238 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:32.792210 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 18:59:34.540678 ip-10-0-137-19 kubenswrapper[2579]: I0422 
18:59:34.540632 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 22 18:59:34.543657 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:34.543635 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" Apr 22 18:59:39.540308 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:39.540247 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused" Apr 22 18:59:39.545636 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:39.545609 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused" Apr 22 18:59:39.545974 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:39.545946 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 18:59:42.585115 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.585085 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"] Apr 22 18:59:42.587865 ip-10-0-137-19 
kubenswrapper[2579]: I0422 18:59:42.587843 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.590748 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.590731 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-predictor-serving-cert\"" Apr 22 18:59:42.590851 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.590778 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-logger-kube-rbac-proxy-sar-config\"" Apr 22 18:59:42.598448 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.598427 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"] Apr 22 18:59:42.710571 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.710511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.710742 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.710595 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl28n\" (UniqueName: \"kubernetes.io/projected/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kube-api-access-gl28n\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.710742 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.710622 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/217fb72f-0bc2-4512-b29b-42d704a0f1cc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.710742 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.710662 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.811147 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.811116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl28n\" (UniqueName: \"kubernetes.io/projected/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kube-api-access-gl28n\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.811354 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.811163 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/217fb72f-0bc2-4512-b29b-42d704a0f1cc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.811354 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.811220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls\") pod 
\"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.811354 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.811252 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 18:59:42.811520 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:42.811353 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-logger-predictor-serving-cert: secret "isvc-logger-predictor-serving-cert" not found Apr 22 18:59:42.811520 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:42.811413 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls podName:217fb72f-0bc2-4512-b29b-42d704a0f1cc nodeName:}" failed. No retries permitted until 2026-04-22 18:59:43.311397344 +0000 UTC m=+795.310971219 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls") pod "isvc-logger-predictor-64d54fcc88-4jfbp" (UID: "217fb72f-0bc2-4512-b29b-42d704a0f1cc") : secret "isvc-logger-predictor-serving-cert" not found
Apr 22 18:59:42.811697 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.811677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kserve-provision-location\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:42.811943 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.811925 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/217fb72f-0bc2-4512-b29b-42d704a0f1cc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:42.822288 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:42.822250 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl28n\" (UniqueName: \"kubernetes.io/projected/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kube-api-access-gl28n\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:43.317012 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:43.316972 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:43.319397 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:43.319377 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls\") pod \"isvc-logger-predictor-64d54fcc88-4jfbp\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:43.497882 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:43.497847 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:43.618825 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:43.618780 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"]
Apr 22 18:59:43.620472 ip-10-0-137-19 kubenswrapper[2579]: W0422 18:59:43.620432 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod217fb72f_0bc2_4512_b29b_42d704a0f1cc.slice/crio-18e4bc1f82048cb87d689dc31479d6bb6cb84ee281a729cb23e4a6f3d43abc7e WatchSource:0}: Error finding container 18e4bc1f82048cb87d689dc31479d6bb6cb84ee281a729cb23e4a6f3d43abc7e: Status 404 returned error can't find the container with id 18e4bc1f82048cb87d689dc31479d6bb6cb84ee281a729cb23e4a6f3d43abc7e
Apr 22 18:59:43.836674 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:43.836589 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerStarted","Data":"32c7e3ab5fa72e35e3b4c048ae7cd3539b8330df208a7a48401415bf3c062aa0"}
Apr 22 18:59:43.836674 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:43.836629 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerStarted","Data":"18e4bc1f82048cb87d689dc31479d6bb6cb84ee281a729cb23e4a6f3d43abc7e"}
Apr 22 18:59:44.540475 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:44.540432 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused"
Apr 22 18:59:47.848795 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:47.848706 2579 generic.go:358] "Generic (PLEG): container finished" podID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerID="32c7e3ab5fa72e35e3b4c048ae7cd3539b8330df208a7a48401415bf3c062aa0" exitCode=0
Apr 22 18:59:47.848795 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:47.848753 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerDied","Data":"32c7e3ab5fa72e35e3b4c048ae7cd3539b8330df208a7a48401415bf3c062aa0"}
Apr 22 18:59:48.853153 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:48.853115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerStarted","Data":"eceb87cdaabd96f66a2882d799ddf69a01a97bd654b983a0f59b81f272f46797"}
Apr 22 18:59:48.853624 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:48.853160 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerStarted","Data":"1d67c261dad056e0a44e760d118a1b10039eff4262be7b7126e426db4f7d9aa0"}
Apr 22 18:59:48.853624 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:48.853174 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerStarted","Data":"4fad33edf7266a41e6d9ad331b9b30e3e541e47381c2bb4284d9e334840604a9"}
Apr 22 18:59:48.853624 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:48.853452 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:48.893470 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:48.893416 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podStartSLOduration=6.893403178 podStartE2EDuration="6.893403178s" podCreationTimestamp="2026-04-22 18:59:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 18:59:48.892426396 +0000 UTC m=+800.892000293" watchObservedRunningTime="2026-04-22 18:59:48.893403178 +0000 UTC m=+800.892977139"
Apr 22 18:59:49.540779 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.540736 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.22:8643/healthz\": dial tcp 10.132.0.22:8643: connect: connection refused"
Apr 22 18:59:49.545097 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.545071 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.22:5000: connect: connection refused"
Apr 22 18:59:49.545191 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.545183 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"
Apr 22 18:59:49.545429 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.545403 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:59:49.545529 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.545515 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"
Apr 22 18:59:49.855826 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.855740 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:49.855826 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.855775 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:49.857213 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.857177 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 18:59:49.857813 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:49.857790 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:59:50.858744 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:50.858706 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 18:59:50.859138 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:50.859098 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 18:59:52.640285 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.640248 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"
Apr 22 18:59:52.686725 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.686683 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97cd9146-5575-42cd-8f0d-6609514a9dc1-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") pod \"97cd9146-5575-42cd-8f0d-6609514a9dc1\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") "
Apr 22 18:59:52.686885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.686746 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clmz5\" (UniqueName: \"kubernetes.io/projected/97cd9146-5575-42cd-8f0d-6609514a9dc1-kube-api-access-clmz5\") pod \"97cd9146-5575-42cd-8f0d-6609514a9dc1\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") "
Apr 22 18:59:52.686885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.686783 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97cd9146-5575-42cd-8f0d-6609514a9dc1-kserve-provision-location\") pod \"97cd9146-5575-42cd-8f0d-6609514a9dc1\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") "
Apr 22 18:59:52.686885 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.686833 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97cd9146-5575-42cd-8f0d-6609514a9dc1-proxy-tls\") pod \"97cd9146-5575-42cd-8f0d-6609514a9dc1\" (UID: \"97cd9146-5575-42cd-8f0d-6609514a9dc1\") "
Apr 22 18:59:52.687118 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.687094 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97cd9146-5575-42cd-8f0d-6609514a9dc1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "97cd9146-5575-42cd-8f0d-6609514a9dc1" (UID: "97cd9146-5575-42cd-8f0d-6609514a9dc1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 18:59:52.687165 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.687099 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97cd9146-5575-42cd-8f0d-6609514a9dc1-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config") pod "97cd9146-5575-42cd-8f0d-6609514a9dc1" (UID: "97cd9146-5575-42cd-8f0d-6609514a9dc1"). InnerVolumeSpecName "isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 18:59:52.688789 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.688767 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97cd9146-5575-42cd-8f0d-6609514a9dc1-kube-api-access-clmz5" (OuterVolumeSpecName: "kube-api-access-clmz5") pod "97cd9146-5575-42cd-8f0d-6609514a9dc1" (UID: "97cd9146-5575-42cd-8f0d-6609514a9dc1"). InnerVolumeSpecName "kube-api-access-clmz5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 18:59:52.688879 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.688824 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/97cd9146-5575-42cd-8f0d-6609514a9dc1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "97cd9146-5575-42cd-8f0d-6609514a9dc1" (UID: "97cd9146-5575-42cd-8f0d-6609514a9dc1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 18:59:52.787773 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.787706 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/97cd9146-5575-42cd-8f0d-6609514a9dc1-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 18:59:52.787773 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.787730 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/97cd9146-5575-42cd-8f0d-6609514a9dc1-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 18:59:52.787773 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.787742 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/97cd9146-5575-42cd-8f0d-6609514a9dc1-isvc-sklearn-batcher-custom-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 18:59:52.787773 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.787751 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-clmz5\" (UniqueName: \"kubernetes.io/projected/97cd9146-5575-42cd-8f0d-6609514a9dc1-kube-api-access-clmz5\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 18:59:52.865207 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.865172 2579 generic.go:358] "Generic (PLEG): container finished" podID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerID="75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655" exitCode=0
Apr 22 18:59:52.865372 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.865215 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerDied","Data":"75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655"}
Apr 22 18:59:52.865372 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.865241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf" event={"ID":"97cd9146-5575-42cd-8f0d-6609514a9dc1","Type":"ContainerDied","Data":"71b11790321e9036712b6cb00acf9d4aaec13403a3316f4e3add8b29f7b0425e"}
Apr 22 18:59:52.865372 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.865277 2579 scope.go:117] "RemoveContainer" containerID="75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655"
Apr 22 18:59:52.865372 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.865281 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"
Apr 22 18:59:52.873140 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.873119 2579 scope.go:117] "RemoveContainer" containerID="51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8"
Apr 22 18:59:52.880303 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.880287 2579 scope.go:117] "RemoveContainer" containerID="cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208"
Apr 22 18:59:52.886768 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.886749 2579 scope.go:117] "RemoveContainer" containerID="a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477"
Apr 22 18:59:52.889246 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.889226 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"]
Apr 22 18:59:52.893455 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.893432 2579 scope.go:117] "RemoveContainer" containerID="75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655"
Apr 22 18:59:52.893933 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:52.893803 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655\": container with ID starting with 75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655 not found: ID does not exist" containerID="75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655"
Apr 22 18:59:52.893933 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.893837 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655"} err="failed to get container status \"75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655\": rpc error: code = NotFound desc = could not find container \"75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655\": container with ID starting with 75d5c4b7acf6d1e8d001f8c66b556d0e5e63bf3af449cfb35e85b67af8f1d655 not found: ID does not exist"
Apr 22 18:59:52.893933 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.893860 2579 scope.go:117] "RemoveContainer" containerID="51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8"
Apr 22 18:59:52.894229 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:52.894211 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8\": container with ID starting with 51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8 not found: ID does not exist" containerID="51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8"
Apr 22 18:59:52.894310 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.894237 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8"} err="failed to get container status \"51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8\": rpc error: code = NotFound desc = could not find container \"51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8\": container with ID starting with 51a36ba637b5be674bda098f483c815ff52dbe6da3b4af3b6a941c199bcf68a8 not found: ID does not exist"
Apr 22 18:59:52.894383 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.894366 2579 scope.go:117] "RemoveContainer" containerID="cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208"
Apr 22 18:59:52.894684 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:52.894666 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208\": container with ID starting with cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208 not found: ID does not exist" containerID="cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208"
Apr 22 18:59:52.894761 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.894691 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208"} err="failed to get container status \"cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208\": rpc error: code = NotFound desc = could not find container \"cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208\": container with ID starting with cd19d894017450ae20f02932e6bdb3c5c61c47ef60441c03335903340bab3208 not found: ID does not exist"
Apr 22 18:59:52.894761 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.894714 2579 scope.go:117] "RemoveContainer" containerID="a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477"
Apr 22 18:59:52.894976 ip-10-0-137-19 kubenswrapper[2579]: E0422 18:59:52.894959 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477\": container with ID starting with a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477 not found: ID does not exist" containerID="a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477"
Apr 22 18:59:52.895050 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.894985 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477"} err="failed to get container status \"a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477\": rpc error: code = NotFound desc = could not find container \"a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477\": container with ID starting with a687c9829aab3d8922c6d6d6138c853519cee3e04e09a97e90ad69354613d477 not found: ID does not exist"
Apr 22 18:59:52.895721 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:52.895702 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-ccbd696dd-qh8nf"]
Apr 22 18:59:54.543493 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:54.543459 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" path="/var/lib/kubelet/pods/97cd9146-5575-42cd-8f0d-6609514a9dc1/volumes"
Apr 22 18:59:55.863986 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:55.863951 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 18:59:55.864674 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:55.864630 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 18:59:55.865141 ip-10-0-137-19 kubenswrapper[2579]: I0422 18:59:55.865121 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:05.865199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:05.865156 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 19:00:05.865684 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:05.865659 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:15.865337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:15.865295 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 19:00:15.865816 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:15.865793 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:25.865453 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:25.865405 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 19:00:25.865950 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:25.865846 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:35.864955 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:35.864907 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 19:00:35.865365 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:35.865325 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:45.865193 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:45.865148 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused"
Apr 22 19:00:45.865680 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:45.865603 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 22 19:00:55.865449 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:55.865418 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 19:00:55.865933 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:00:55.865486 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 19:01:07.629186 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.629158 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-c7d86bcbd-6zbs8_5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9/kserve-container/0.log"
Apr 22 19:01:07.776649 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.776616 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8"]
Apr 22 19:01:07.776965 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.776933 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kserve-container" containerID="cri-o://33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f" gracePeriod=30
Apr 22 19:01:07.777136 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.777101 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kube-rbac-proxy" containerID="cri-o://cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e" gracePeriod=30
Apr 22 19:01:07.788897 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.788851 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.23:8643/healthz\": dial tcp 10.132.0.23:8643: connect: connection refused"
Apr 22 19:01:07.835687 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.835653 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"]
Apr 22 19:01:07.836058 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.836034 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" containerID="cri-o://4fad33edf7266a41e6d9ad331b9b30e3e541e47381c2bb4284d9e334840604a9" gracePeriod=30
Apr 22 19:01:07.836186 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.836128 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy" containerID="cri-o://1d67c261dad056e0a44e760d118a1b10039eff4262be7b7126e426db4f7d9aa0" gracePeriod=30
Apr 22 19:01:07.836186 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.836154 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" containerID="cri-o://eceb87cdaabd96f66a2882d799ddf69a01a97bd654b983a0f59b81f272f46797" gracePeriod=30
Apr 22 19:01:07.862364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862338 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"]
Apr 22 19:01:07.862691 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862677 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container"
Apr 22 19:01:07.862732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862694 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container"
Apr 22 19:01:07.862732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862702 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent"
Apr 22 19:01:07.862732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862708 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent"
Apr 22 19:01:07.862732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862721 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="storage-initializer"
Apr 22 19:01:07.862732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862727 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="storage-initializer"
Apr 22 19:01:07.862884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862739 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy"
Apr 22 19:01:07.862884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862747 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy"
Apr 22 19:01:07.862884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862798 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="agent"
Apr 22 19:01:07.862884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862805 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kserve-container"
Apr 22 19:01:07.862884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.862816 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="97cd9146-5575-42cd-8f0d-6609514a9dc1" containerName="kube-rbac-proxy"
Apr 22 19:01:07.865893 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.865879 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:01:07.868356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.868329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-predictor-serving-cert\""
Apr 22 19:01:07.868356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.868346 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-kube-rbac-proxy-sar-config\""
Apr 22 19:01:07.874759 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.874740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"]
Apr 22 19:01:07.964140 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.964113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55q89\" (UniqueName: \"kubernetes.io/projected/07d85718-3e0b-4f3b-b3af-da286afb0dde-kube-api-access-55q89\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:01:07.964286 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.964167 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:01:07.964286 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.964203 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d85718-3e0b-4f3b-b3af-da286afb0dde-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:01:07.964364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:07.964310 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07d85718-3e0b-4f3b-b3af-da286afb0dde-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:01:08.065670 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.065642 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-55q89\" (UniqueName: \"kubernetes.io/projected/07d85718-3e0b-4f3b-b3af-da286afb0dde-kube-api-access-55q89\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:01:08.065782 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.065700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:01:08.065782 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.065727 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d85718-3e0b-4f3b-b3af-da286afb0dde-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 
22 19:01:08.065782 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.065748 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07d85718-3e0b-4f3b-b3af-da286afb0dde-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:08.065998 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:01:08.065968 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-predictor-serving-cert: secret "isvc-lightgbm-predictor-serving-cert" not found Apr 22 19:01:08.066122 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:01:08.066039 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls podName:07d85718-3e0b-4f3b-b3af-da286afb0dde nodeName:}" failed. No retries permitted until 2026-04-22 19:01:08.566019471 +0000 UTC m=+880.565593349 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls") pod "isvc-lightgbm-predictor-bdf964bd-n5g9r" (UID: "07d85718-3e0b-4f3b-b3af-da286afb0dde") : secret "isvc-lightgbm-predictor-serving-cert" not found Apr 22 19:01:08.066225 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.066202 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d85718-3e0b-4f3b-b3af-da286afb0dde-kserve-provision-location\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:08.066517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.066501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07d85718-3e0b-4f3b-b3af-da286afb0dde-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:08.071071 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.071054 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 19:01:08.076398 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.076002 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-55q89\" (UniqueName: \"kubernetes.io/projected/07d85718-3e0b-4f3b-b3af-da286afb0dde-kube-api-access-55q89\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:08.080704 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.080681 2579 generic.go:358] "Generic (PLEG): container finished" podID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerID="cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e" exitCode=2 Apr 22 19:01:08.080704 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.080702 2579 generic.go:358] "Generic (PLEG): container finished" podID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerID="33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f" exitCode=2 Apr 22 19:01:08.080860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.080754 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" event={"ID":"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9","Type":"ContainerDied","Data":"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e"} Apr 22 19:01:08.080860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.080788 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" event={"ID":"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9","Type":"ContainerDied","Data":"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f"} Apr 22 19:01:08.080860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.080791 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" Apr 22 19:01:08.080860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.080801 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8" event={"ID":"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9","Type":"ContainerDied","Data":"abcbc5b38b484a1f8e91d9e8d4976934ca9154a094b298055aaa5a6e83f4cabb"} Apr 22 19:01:08.080860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.080818 2579 scope.go:117] "RemoveContainer" containerID="cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e" Apr 22 19:01:08.083100 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.083081 2579 generic.go:358] "Generic (PLEG): container finished" podID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerID="1d67c261dad056e0a44e760d118a1b10039eff4262be7b7126e426db4f7d9aa0" exitCode=2 Apr 22 19:01:08.083172 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.083108 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerDied","Data":"1d67c261dad056e0a44e760d118a1b10039eff4262be7b7126e426db4f7d9aa0"} Apr 22 19:01:08.088369 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.088353 2579 scope.go:117] "RemoveContainer" containerID="33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f" Apr 22 19:01:08.097027 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.097003 2579 scope.go:117] "RemoveContainer" containerID="cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e" Apr 22 19:01:08.097331 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:01:08.097313 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e\": container with ID starting with 
cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e not found: ID does not exist" containerID="cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e" Apr 22 19:01:08.097407 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.097339 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e"} err="failed to get container status \"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e\": rpc error: code = NotFound desc = could not find container \"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e\": container with ID starting with cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e not found: ID does not exist" Apr 22 19:01:08.097407 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.097356 2579 scope.go:117] "RemoveContainer" containerID="33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f" Apr 22 19:01:08.097603 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:01:08.097587 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f\": container with ID starting with 33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f not found: ID does not exist" containerID="33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f" Apr 22 19:01:08.097647 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.097609 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f"} err="failed to get container status \"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f\": rpc error: code = NotFound desc = could not find container \"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f\": container with ID starting with 
33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f not found: ID does not exist" Apr 22 19:01:08.097647 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.097627 2579 scope.go:117] "RemoveContainer" containerID="cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e" Apr 22 19:01:08.097873 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.097854 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e"} err="failed to get container status \"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e\": rpc error: code = NotFound desc = could not find container \"cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e\": container with ID starting with cca8fff8bc343fc282bc048fa97f0161af9cc0f3622d33be7c9b42fb0b56e02e not found: ID does not exist" Apr 22 19:01:08.097919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.097873 2579 scope.go:117] "RemoveContainer" containerID="33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f" Apr 22 19:01:08.098087 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.098059 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f"} err="failed to get container status \"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f\": rpc error: code = NotFound desc = could not find container \"33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f\": container with ID starting with 33cd1ca754837bf5f7d00e26ff39346a05c278e31a4ce4e3b621147c35b45b6f not found: ID does not exist" Apr 22 19:01:08.166146 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.166104 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-message-dumper-kube-rbac-proxy-sar-config\") pod \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " Apr 22 19:01:08.166146 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.166150 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf98s\" (UniqueName: \"kubernetes.io/projected/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-kube-api-access-gf98s\") pod \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " Apr 22 19:01:08.166389 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.166197 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls\") pod \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\" (UID: \"5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9\") " Apr 22 19:01:08.166525 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.166499 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-message-dumper-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "message-dumper-kube-rbac-proxy-sar-config") pod "5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" (UID: "5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9"). InnerVolumeSpecName "message-dumper-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:01:08.168343 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.168321 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-kube-api-access-gf98s" (OuterVolumeSpecName: "kube-api-access-gf98s") pod "5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" (UID: "5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9"). InnerVolumeSpecName "kube-api-access-gf98s". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:01:08.168343 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.168327 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" (UID: "5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:01:08.267689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.267602 2579 reconciler_common.go:299] "Volume detached for volume \"message-dumper-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-message-dumper-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:01:08.267689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.267632 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gf98s\" (UniqueName: \"kubernetes.io/projected/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-kube-api-access-gf98s\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:01:08.267689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.267642 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:01:08.405390 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.405357 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8"] Apr 22 19:01:08.409772 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.409746 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-c7d86bcbd-6zbs8"] Apr 22 19:01:08.543598 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.543518 2579 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" path="/var/lib/kubelet/pods/5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9/volumes" Apr 22 19:01:08.571338 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.571310 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:08.573656 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.573637 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls\") pod \"isvc-lightgbm-predictor-bdf964bd-n5g9r\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:08.776139 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.776107 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:08.894594 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:08.894573 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"] Apr 22 19:01:08.897030 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:01:08.896999 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07d85718_3e0b_4f3b_b3af_da286afb0dde.slice/crio-2ab1abbb6306e176085d5970641a58dc76cacbb12e9e2fe7a509c1968aedd517 WatchSource:0}: Error finding container 2ab1abbb6306e176085d5970641a58dc76cacbb12e9e2fe7a509c1968aedd517: Status 404 returned error can't find the container with id 2ab1abbb6306e176085d5970641a58dc76cacbb12e9e2fe7a509c1968aedd517 Apr 22 19:01:09.088094 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:09.088007 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerStarted","Data":"0fafe683a8a0814b28b642ae93184bfc6a06fd348464f26d5c15475ce0ea55af"} Apr 22 19:01:09.088094 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:09.088053 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerStarted","Data":"2ab1abbb6306e176085d5970641a58dc76cacbb12e9e2fe7a509c1968aedd517"} Apr 22 19:01:10.859285 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:10.859226 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 22 19:01:12.098521 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:01:12.098438 2579 generic.go:358] "Generic (PLEG): container finished" podID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerID="4fad33edf7266a41e6d9ad331b9b30e3e541e47381c2bb4284d9e334840604a9" exitCode=0 Apr 22 19:01:12.098521 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:12.098457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerDied","Data":"4fad33edf7266a41e6d9ad331b9b30e3e541e47381c2bb4284d9e334840604a9"} Apr 22 19:01:13.102781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:13.102695 2579 generic.go:358] "Generic (PLEG): container finished" podID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerID="0fafe683a8a0814b28b642ae93184bfc6a06fd348464f26d5c15475ce0ea55af" exitCode=0 Apr 22 19:01:13.102781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:13.102736 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerDied","Data":"0fafe683a8a0814b28b642ae93184bfc6a06fd348464f26d5c15475ce0ea55af"} Apr 22 19:01:15.860052 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:15.859902 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 22 19:01:15.864697 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:15.864579 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 
19:01:15.865038 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:15.864972 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:01:20.859138 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:20.859091 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 22 19:01:20.859582 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:20.859292 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 19:01:23.136016 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:23.135979 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerStarted","Data":"e00b9f02b0f5ef57dcd64ce34057f8b259bc5a67509abc17cee3c012925dbd02"} Apr 22 19:01:23.136016 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:23.136022 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerStarted","Data":"229e3568120aa8fa8e34371e6476cbbb540dcba31b8df260ddf224fcbf6ae6e5"} Apr 22 19:01:23.136444 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:23.136361 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:23.136515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:23.136497 2579 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:23.137702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:23.137679 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:01:23.157439 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:23.157366 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podStartSLOduration=7.182104973 podStartE2EDuration="16.157349589s" podCreationTimestamp="2026-04-22 19:01:07 +0000 UTC" firstStartedPulling="2026-04-22 19:01:13.103977342 +0000 UTC m=+885.103551220" lastFinishedPulling="2026-04-22 19:01:22.079221955 +0000 UTC m=+894.078795836" observedRunningTime="2026-04-22 19:01:23.15624492 +0000 UTC m=+895.155818828" watchObservedRunningTime="2026-04-22 19:01:23.157349589 +0000 UTC m=+895.156923487" Apr 22 19:01:24.139657 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:24.139614 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:01:25.141955 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:25.141910 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:01:25.859454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:25.859413 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 22 19:01:25.864806 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:25.864779 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:01:25.865112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:25.865087 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:01:30.147513 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:30.147482 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" Apr 22 19:01:30.148021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:30.147996 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 22 19:01:30.859744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:30.859653 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 22 
19:01:35.858994 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:35.858953 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.24:8643/healthz\": dial tcp 10.132.0.24:8643: connect: connection refused" Apr 22 19:01:35.865411 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:35.865371 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.24:8080: connect: connection refused" Apr 22 19:01:35.865547 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:35.865511 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 19:01:35.865753 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:35.865730 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 19:01:35.865858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:35.865808 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 19:01:38.182123 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.182093 2579 generic.go:358] "Generic (PLEG): container finished" podID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerID="eceb87cdaabd96f66a2882d799ddf69a01a97bd654b983a0f59b81f272f46797" exitCode=0 Apr 22 19:01:38.182494 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.182129 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerDied","Data":"eceb87cdaabd96f66a2882d799ddf69a01a97bd654b983a0f59b81f272f46797"} Apr 22 19:01:38.473072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.473045 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" Apr 22 19:01:38.614070 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.614037 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl28n\" (UniqueName: \"kubernetes.io/projected/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kube-api-access-gl28n\") pod \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " Apr 22 19:01:38.614248 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.614096 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kserve-provision-location\") pod \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " Apr 22 19:01:38.614248 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.614135 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls\") pod \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " Apr 22 19:01:38.614248 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.614175 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/217fb72f-0bc2-4512-b29b-42d704a0f1cc-isvc-logger-kube-rbac-proxy-sar-config\") pod \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\" (UID: \"217fb72f-0bc2-4512-b29b-42d704a0f1cc\") " 
Apr 22 19:01:38.614535 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.614501 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "217fb72f-0bc2-4512-b29b-42d704a0f1cc" (UID: "217fb72f-0bc2-4512-b29b-42d704a0f1cc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:01:38.614638 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.614555 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/217fb72f-0bc2-4512-b29b-42d704a0f1cc-isvc-logger-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-logger-kube-rbac-proxy-sar-config") pod "217fb72f-0bc2-4512-b29b-42d704a0f1cc" (UID: "217fb72f-0bc2-4512-b29b-42d704a0f1cc"). InnerVolumeSpecName "isvc-logger-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:01:38.616172 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.616151 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "217fb72f-0bc2-4512-b29b-42d704a0f1cc" (UID: "217fb72f-0bc2-4512-b29b-42d704a0f1cc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:01:38.616335 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.616318 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kube-api-access-gl28n" (OuterVolumeSpecName: "kube-api-access-gl28n") pod "217fb72f-0bc2-4512-b29b-42d704a0f1cc" (UID: "217fb72f-0bc2-4512-b29b-42d704a0f1cc"). InnerVolumeSpecName "kube-api-access-gl28n". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:01:38.715380 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.715306 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:01:38.715380 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.715333 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/217fb72f-0bc2-4512-b29b-42d704a0f1cc-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:01:38.715380 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.715352 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-logger-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/217fb72f-0bc2-4512-b29b-42d704a0f1cc-isvc-logger-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:01:38.715380 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:38.715363 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gl28n\" (UniqueName: \"kubernetes.io/projected/217fb72f-0bc2-4512-b29b-42d704a0f1cc-kube-api-access-gl28n\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:01:39.190708 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.190672 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp" event={"ID":"217fb72f-0bc2-4512-b29b-42d704a0f1cc","Type":"ContainerDied","Data":"18e4bc1f82048cb87d689dc31479d6bb6cb84ee281a729cb23e4a6f3d43abc7e"}
Apr 22 19:01:39.191112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.190719 2579 scope.go:117] "RemoveContainer" containerID="eceb87cdaabd96f66a2882d799ddf69a01a97bd654b983a0f59b81f272f46797"
Apr 22 19:01:39.191112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.190736 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"
Apr 22 19:01:39.198742 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.198723 2579 scope.go:117] "RemoveContainer" containerID="1d67c261dad056e0a44e760d118a1b10039eff4262be7b7126e426db4f7d9aa0"
Apr 22 19:01:39.205391 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.205376 2579 scope.go:117] "RemoveContainer" containerID="4fad33edf7266a41e6d9ad331b9b30e3e541e47381c2bb4284d9e334840604a9"
Apr 22 19:01:39.211864 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.211847 2579 scope.go:117] "RemoveContainer" containerID="32c7e3ab5fa72e35e3b4c048ae7cd3539b8330df208a7a48401415bf3c062aa0"
Apr 22 19:01:39.216519 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.216469 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"]
Apr 22 19:01:39.218504 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:39.218482 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-64d54fcc88-4jfbp"]
Apr 22 19:01:40.148460 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:40.148422 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 22 19:01:40.544077 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:40.544033 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" path="/var/lib/kubelet/pods/217fb72f-0bc2-4512-b29b-42d704a0f1cc/volumes"
Apr 22 19:01:50.148373 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:01:50.148329 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 22 19:02:00.148161 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:00.148118 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 22 19:02:10.148392 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:10.148352 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 22 19:02:20.148530 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:20.148494 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 22 19:02:30.148031 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:30.147990 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 22 19:02:40.149181 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:40.149147 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:02:47.987365 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:47.987255 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"]
Apr 22 19:02:47.987750 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:47.987600 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" containerID="cri-o://229e3568120aa8fa8e34371e6476cbbb540dcba31b8df260ddf224fcbf6ae6e5" gracePeriod=30
Apr 22 19:02:47.987750 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:47.987635 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kube-rbac-proxy" containerID="cri-o://e00b9f02b0f5ef57dcd64ce34057f8b259bc5a67509abc17cee3c012925dbd02" gracePeriod=30
Apr 22 19:02:48.080415 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080385 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"]
Apr 22 19:02:48.080675 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080662 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kube-rbac-proxy"
Apr 22 19:02:48.080723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080677 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kube-rbac-proxy"
Apr 22 19:02:48.080723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080686 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container"
Apr 22 19:02:48.080723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080692 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container"
Apr 22 19:02:48.080723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080705 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="storage-initializer"
Apr 22 19:02:48.080723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080711 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="storage-initializer"
Apr 22 19:02:48.080723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080717 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent"
Apr 22 19:02:48.080723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080722 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080729 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kserve-container"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080734 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kserve-container"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080744 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080749 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080792 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kserve-container"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080800 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5c8200e0-1c22-4bc8-bc1c-74b966e8f9e9" containerName="kube-rbac-proxy"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080807 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kserve-container"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080813 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="agent"
Apr 22 19:02:48.080924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.080819 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="217fb72f-0bc2-4512-b29b-42d704a0f1cc" containerName="kube-rbac-proxy"
Apr 22 19:02:48.083563 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.083544 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.086162 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.086147 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-predictor-serving-cert\""
Apr 22 19:02:48.086221 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.086197 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\""
Apr 22 19:02:48.095016 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.094995 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"]
Apr 22 19:02:48.144135 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.144104 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.144135 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.144148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.144370 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.144175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd41cc09-9ffe-4f84-9939-a4de20035ff4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.144370 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.144236 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htbgl\" (UniqueName: \"kubernetes.io/projected/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kube-api-access-htbgl\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.245625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.245532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.245625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.245578 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.245625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.245608 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd41cc09-9ffe-4f84-9939-a4de20035ff4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.245625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.245627 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-htbgl\" (UniqueName: \"kubernetes.io/projected/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kube-api-access-htbgl\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.245940 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:02:48.245695 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-serving-cert: secret "isvc-lightgbm-runtime-predictor-serving-cert" not found
Apr 22 19:02:48.245940 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:02:48.245779 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls podName:fd41cc09-9ffe-4f84-9939-a4de20035ff4 nodeName:}" failed. No retries permitted until 2026-04-22 19:02:48.745758645 +0000 UTC m=+980.745332525 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls") pod "isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" (UID: "fd41cc09-9ffe-4f84-9939-a4de20035ff4") : secret "isvc-lightgbm-runtime-predictor-serving-cert" not found
Apr 22 19:02:48.246056 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.245988 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.246308 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.246291 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd41cc09-9ffe-4f84-9939-a4de20035ff4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.254505 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.254482 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-htbgl\" (UniqueName: \"kubernetes.io/projected/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kube-api-access-htbgl\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.389031 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.388995 2579 generic.go:358] "Generic (PLEG): container finished" podID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerID="e00b9f02b0f5ef57dcd64ce34057f8b259bc5a67509abc17cee3c012925dbd02" exitCode=2
Apr 22 19:02:48.389194 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.389065 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerDied","Data":"e00b9f02b0f5ef57dcd64ce34057f8b259bc5a67509abc17cee3c012925dbd02"}
Apr 22 19:02:48.749470 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.749434 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.751830 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.751798 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls\") pod \"isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:48.993301 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:48.993241 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:49.109228 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:49.109206 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"]
Apr 22 19:02:49.111833 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:02:49.111800 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd41cc09_9ffe_4f84_9939_a4de20035ff4.slice/crio-de0ca4e912aa2344cdacc0a837466881cfe6bf4345e9762176141f2ca2785398 WatchSource:0}: Error finding container de0ca4e912aa2344cdacc0a837466881cfe6bf4345e9762176141f2ca2785398: Status 404 returned error can't find the container with id de0ca4e912aa2344cdacc0a837466881cfe6bf4345e9762176141f2ca2785398
Apr 22 19:02:49.392976 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:49.392893 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerStarted","Data":"1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b"}
Apr 22 19:02:49.392976 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:49.392933 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerStarted","Data":"de0ca4e912aa2344cdacc0a837466881cfe6bf4345e9762176141f2ca2785398"}
Apr 22 19:02:50.142706 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:50.142664 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.25:8643/healthz\": dial tcp 10.132.0.25:8643: connect: connection refused"
Apr 22 19:02:50.148453 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:50.148422 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 22 19:02:52.403408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.403374 2579 generic.go:358] "Generic (PLEG): container finished" podID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerID="229e3568120aa8fa8e34371e6476cbbb540dcba31b8df260ddf224fcbf6ae6e5" exitCode=0
Apr 22 19:02:52.403743 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.403429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerDied","Data":"229e3568120aa8fa8e34371e6476cbbb540dcba31b8df260ddf224fcbf6ae6e5"}
Apr 22 19:02:52.423935 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.423915 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:02:52.581296 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.581194 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07d85718-3e0b-4f3b-b3af-da286afb0dde-isvc-lightgbm-kube-rbac-proxy-sar-config\") pod \"07d85718-3e0b-4f3b-b3af-da286afb0dde\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") "
Apr 22 19:02:52.581296 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.581232 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55q89\" (UniqueName: \"kubernetes.io/projected/07d85718-3e0b-4f3b-b3af-da286afb0dde-kube-api-access-55q89\") pod \"07d85718-3e0b-4f3b-b3af-da286afb0dde\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") "
Apr 22 19:02:52.581505 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.581306 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls\") pod \"07d85718-3e0b-4f3b-b3af-da286afb0dde\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") "
Apr 22 19:02:52.581505 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.581333 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d85718-3e0b-4f3b-b3af-da286afb0dde-kserve-provision-location\") pod \"07d85718-3e0b-4f3b-b3af-da286afb0dde\" (UID: \"07d85718-3e0b-4f3b-b3af-da286afb0dde\") "
Apr 22 19:02:52.581621 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.581544 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07d85718-3e0b-4f3b-b3af-da286afb0dde-isvc-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-kube-rbac-proxy-sar-config") pod "07d85718-3e0b-4f3b-b3af-da286afb0dde" (UID: "07d85718-3e0b-4f3b-b3af-da286afb0dde"). InnerVolumeSpecName "isvc-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:02:52.581783 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.581759 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07d85718-3e0b-4f3b-b3af-da286afb0dde-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "07d85718-3e0b-4f3b-b3af-da286afb0dde" (UID: "07d85718-3e0b-4f3b-b3af-da286afb0dde"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:02:52.583297 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.583275 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "07d85718-3e0b-4f3b-b3af-da286afb0dde" (UID: "07d85718-3e0b-4f3b-b3af-da286afb0dde"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:02:52.583404 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.583386 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07d85718-3e0b-4f3b-b3af-da286afb0dde-kube-api-access-55q89" (OuterVolumeSpecName: "kube-api-access-55q89") pod "07d85718-3e0b-4f3b-b3af-da286afb0dde" (UID: "07d85718-3e0b-4f3b-b3af-da286afb0dde"). InnerVolumeSpecName "kube-api-access-55q89". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:02:52.682587 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.682554 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/07d85718-3e0b-4f3b-b3af-da286afb0dde-isvc-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:02:52.682587 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.682581 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-55q89\" (UniqueName: \"kubernetes.io/projected/07d85718-3e0b-4f3b-b3af-da286afb0dde-kube-api-access-55q89\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:02:52.682587 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.682591 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07d85718-3e0b-4f3b-b3af-da286afb0dde-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:02:52.682806 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:52.682603 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/07d85718-3e0b-4f3b-b3af-da286afb0dde-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:02:53.408432 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.408398 2579 generic.go:358] "Generic (PLEG): container finished" podID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerID="1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b" exitCode=0
Apr 22 19:02:53.408838 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.408475 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerDied","Data":"1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b"}
Apr 22 19:02:53.410207 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.410186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r" event={"ID":"07d85718-3e0b-4f3b-b3af-da286afb0dde","Type":"ContainerDied","Data":"2ab1abbb6306e176085d5970641a58dc76cacbb12e9e2fe7a509c1968aedd517"}
Apr 22 19:02:53.410339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.410216 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"
Apr 22 19:02:53.410339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.410222 2579 scope.go:117] "RemoveContainer" containerID="e00b9f02b0f5ef57dcd64ce34057f8b259bc5a67509abc17cee3c012925dbd02"
Apr 22 19:02:53.420066 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.420047 2579 scope.go:117] "RemoveContainer" containerID="229e3568120aa8fa8e34371e6476cbbb540dcba31b8df260ddf224fcbf6ae6e5"
Apr 22 19:02:53.432687 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.432667 2579 scope.go:117] "RemoveContainer" containerID="0fafe683a8a0814b28b642ae93184bfc6a06fd348464f26d5c15475ce0ea55af"
Apr 22 19:02:53.443856 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.443831 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"]
Apr 22 19:02:53.447879 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:53.447857 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-bdf964bd-n5g9r"]
Apr 22 19:02:54.415705 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:54.415668 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerStarted","Data":"0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b"}
Apr 22 19:02:54.416141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:54.415712 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerStarted","Data":"7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682"}
Apr 22 19:02:54.416141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:54.416017 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:54.416141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:54.416043 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:02:54.417412 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:54.417379 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:02:54.435414 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:54.435374 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podStartSLOduration=6.435361788 podStartE2EDuration="6.435361788s" podCreationTimestamp="2026-04-22 19:02:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:02:54.434442821 +0000 UTC m=+986.434016717" watchObservedRunningTime="2026-04-22 19:02:54.435361788 +0000 UTC m=+986.434935688"
Apr 22 19:02:54.542874 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:54.542841 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" path="/var/lib/kubelet/pods/07d85718-3e0b-4f3b-b3af-da286afb0dde/volumes"
Apr 22 19:02:55.420176 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:02:55.420135 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:03:00.425086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:03:00.425056 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:03:00.425665 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:03:00.425639 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:03:10.426512 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:03:10.426475 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:03:20.425979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:03:20.425939 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:03:30.426215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:03:30.426163 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:03:40.426366 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:03:40.426322 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:03:50.426377 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:03:50.426332 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:04:00.426366 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:00.426320 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused"
Apr 22 19:04:10.426412 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:10.426383 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"
Apr 22 19:04:18.412888 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.412853 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"]
Apr 22 19:04:18.413371 ip-10-0-137-19 kubenswrapper[2579]:
I0422 19:04:18.413283 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" containerID="cri-o://7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682" gracePeriod=30 Apr 22 19:04:18.413440 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.413338 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kube-rbac-proxy" containerID="cri-o://0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b" gracePeriod=30 Apr 22 19:04:18.502516 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502485 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl"] Apr 22 19:04:18.502824 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502807 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="storage-initializer" Apr 22 19:04:18.502901 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502827 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="storage-initializer" Apr 22 19:04:18.502901 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502838 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kube-rbac-proxy" Apr 22 19:04:18.502901 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502846 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kube-rbac-proxy" Apr 22 19:04:18.502901 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502874 2579 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" Apr 22 19:04:18.502901 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502882 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" Apr 22 19:04:18.503160 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502946 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kube-rbac-proxy" Apr 22 19:04:18.503160 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.502959 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="07d85718-3e0b-4f3b-b3af-da286afb0dde" containerName="kserve-container" Apr 22 19:04:18.505238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.505217 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.507942 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.507920 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-predictor-serving-cert\"" Apr 22 19:04:18.508063 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.507940 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:04:18.515710 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.515684 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl"] Apr 22 19:04:18.601429 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.601390 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d95b\" (UniqueName: \"kubernetes.io/projected/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kube-api-access-2d95b\") 
pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.601429 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.601434 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7107159-6fcc-48f6-b147-a1c7aa1972b4-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.601645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.601469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.601645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.601513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7107159-6fcc-48f6-b147-a1c7aa1972b4-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.644786 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.644757 2579 generic.go:358] "Generic (PLEG): container finished" podID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerID="0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b" exitCode=2 Apr 22 
19:04:18.644936 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.644792 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerDied","Data":"0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b"} Apr 22 19:04:18.702846 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.702822 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7107159-6fcc-48f6-b147-a1c7aa1972b4-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.702956 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.702871 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2d95b\" (UniqueName: \"kubernetes.io/projected/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kube-api-access-2d95b\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.702956 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.702908 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7107159-6fcc-48f6-b147-a1c7aa1972b4-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.703045 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.702953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" 
(UniqueName: \"kubernetes.io/empty-dir/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.703407 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.703390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.703641 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.703624 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7107159-6fcc-48f6-b147-a1c7aa1972b4-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.705358 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.705340 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7107159-6fcc-48f6-b147-a1c7aa1972b4-proxy-tls\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.712598 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.712579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2d95b\" (UniqueName: 
\"kubernetes.io/projected/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kube-api-access-2d95b\") pod \"isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.815343 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.815296 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:04:18.931465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:18.931409 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl"] Apr 22 19:04:18.933513 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:04:18.933485 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7107159_6fcc_48f6_b147_a1c7aa1972b4.slice/crio-5bbe7b202d90fe5ff987de6bb2cf248f262ac8efd034e0d6b2c8d8cd3850343f WatchSource:0}: Error finding container 5bbe7b202d90fe5ff987de6bb2cf248f262ac8efd034e0d6b2c8d8cd3850343f: Status 404 returned error can't find the container with id 5bbe7b202d90fe5ff987de6bb2cf248f262ac8efd034e0d6b2c8d8cd3850343f Apr 22 19:04:19.653556 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:19.653514 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerStarted","Data":"e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3"} Apr 22 19:04:19.653556 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:19.653554 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" 
event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerStarted","Data":"5bbe7b202d90fe5ff987de6bb2cf248f262ac8efd034e0d6b2c8d8cd3850343f"} Apr 22 19:04:20.420824 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:20.420782 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.26:8643/healthz\": dial tcp 10.132.0.26:8643: connect: connection refused" Apr 22 19:04:20.426114 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:20.426092 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:8080: connect: connection refused" Apr 22 19:04:22.663651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:22.663620 2579 generic.go:358] "Generic (PLEG): container finished" podID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerID="e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3" exitCode=0 Apr 22 19:04:22.663927 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:22.663695 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerDied","Data":"e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3"} Apr 22 19:04:22.944561 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:22.944537 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" Apr 22 19:04:23.034668 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.034636 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kserve-provision-location\") pod \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " Apr 22 19:04:23.034821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.034679 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls\") pod \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " Apr 22 19:04:23.034821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.034706 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htbgl\" (UniqueName: \"kubernetes.io/projected/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kube-api-access-htbgl\") pod \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " Apr 22 19:04:23.034821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.034764 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd41cc09-9ffe-4f84-9939-a4de20035ff4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") pod \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\" (UID: \"fd41cc09-9ffe-4f84-9939-a4de20035ff4\") " Apr 22 19:04:23.034943 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.034909 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"fd41cc09-9ffe-4f84-9939-a4de20035ff4" (UID: "fd41cc09-9ffe-4f84-9939-a4de20035ff4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:04:23.035130 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.035108 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd41cc09-9ffe-4f84-9939-a4de20035ff4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config") pod "fd41cc09-9ffe-4f84-9939-a4de20035ff4" (UID: "fd41cc09-9ffe-4f84-9939-a4de20035ff4"). InnerVolumeSpecName "isvc-lightgbm-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:04:23.036746 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.036719 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fd41cc09-9ffe-4f84-9939-a4de20035ff4" (UID: "fd41cc09-9ffe-4f84-9939-a4de20035ff4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:04:23.036845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.036787 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kube-api-access-htbgl" (OuterVolumeSpecName: "kube-api-access-htbgl") pod "fd41cc09-9ffe-4f84-9939-a4de20035ff4" (UID: "fd41cc09-9ffe-4f84-9939-a4de20035ff4"). InnerVolumeSpecName "kube-api-access-htbgl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:04:23.135565 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.135530 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/fd41cc09-9ffe-4f84-9939-a4de20035ff4-isvc-lightgbm-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:04:23.135565 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.135558 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:04:23.135565 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.135569 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fd41cc09-9ffe-4f84-9939-a4de20035ff4-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:04:23.135782 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.135579 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-htbgl\" (UniqueName: \"kubernetes.io/projected/fd41cc09-9ffe-4f84-9939-a4de20035ff4-kube-api-access-htbgl\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:04:23.669656 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.669617 2579 generic.go:358] "Generic (PLEG): container finished" podID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerID="7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682" exitCode=0 Apr 22 19:04:23.670112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.669704 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" 
event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerDied","Data":"7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682"} Apr 22 19:04:23.670112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.669736 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" event={"ID":"fd41cc09-9ffe-4f84-9939-a4de20035ff4","Type":"ContainerDied","Data":"de0ca4e912aa2344cdacc0a837466881cfe6bf4345e9762176141f2ca2785398"} Apr 22 19:04:23.670112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.669756 2579 scope.go:117] "RemoveContainer" containerID="0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b" Apr 22 19:04:23.670112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.669938 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk" Apr 22 19:04:23.685823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.685625 2579 scope.go:117] "RemoveContainer" containerID="7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682" Apr 22 19:04:23.699569 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.699421 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"] Apr 22 19:04:23.699920 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.699835 2579 scope.go:117] "RemoveContainer" containerID="1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b" Apr 22 19:04:23.706461 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.706384 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-749c4f6d58-7gwrk"] Apr 22 19:04:23.716703 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.716635 2579 scope.go:117] "RemoveContainer" containerID="0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b" Apr 22 19:04:23.717456 ip-10-0-137-19 
kubenswrapper[2579]: E0422 19:04:23.717303 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b\": container with ID starting with 0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b not found: ID does not exist" containerID="0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b" Apr 22 19:04:23.717456 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.717343 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b"} err="failed to get container status \"0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b\": rpc error: code = NotFound desc = could not find container \"0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b\": container with ID starting with 0f37226b99dee22ebb9b458d9aef9b60d8e67d105df10b990f600d9c91a7724b not found: ID does not exist" Apr 22 19:04:23.717456 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.717374 2579 scope.go:117] "RemoveContainer" containerID="7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682" Apr 22 19:04:23.717914 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:04:23.717885 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682\": container with ID starting with 7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682 not found: ID does not exist" containerID="7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682" Apr 22 19:04:23.718007 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.717919 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682"} err="failed to 
get container status \"7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682\": rpc error: code = NotFound desc = could not find container \"7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682\": container with ID starting with 7751ad0bde672d7c7d6ff82b649959d27b89feea0eb777566618d4eea6686682 not found: ID does not exist" Apr 22 19:04:23.718007 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.717940 2579 scope.go:117] "RemoveContainer" containerID="1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b" Apr 22 19:04:23.718766 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:04:23.718741 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b\": container with ID starting with 1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b not found: ID does not exist" containerID="1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b" Apr 22 19:04:23.718859 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:23.718774 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b"} err="failed to get container status \"1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b\": rpc error: code = NotFound desc = could not find container \"1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b\": container with ID starting with 1f3d615db73bcd325e2cd7706f67d6176207ceb4f5f7b1ea0c523bb35946205b not found: ID does not exist" Apr 22 19:04:24.547077 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:04:24.546577 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" path="/var/lib/kubelet/pods/fd41cc09-9ffe-4f84-9939-a4de20035ff4/volumes" Apr 22 19:06:49.504600 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:06:49.504578 2579 
provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:06:50.116799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:06:50.116761 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerStarted","Data":"1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b"} Apr 22 19:06:50.116799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:06:50.116798 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerStarted","Data":"ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a"} Apr 22 19:06:50.117004 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:06:50.116877 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:06:50.145929 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:06:50.145861 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" podStartSLOduration=5.440451984 podStartE2EDuration="2m32.145841273s" podCreationTimestamp="2026-04-22 19:04:18 +0000 UTC" firstStartedPulling="2026-04-22 19:04:22.664724598 +0000 UTC m=+1074.664298474" lastFinishedPulling="2026-04-22 19:06:49.370113873 +0000 UTC m=+1221.369687763" observedRunningTime="2026-04-22 19:06:50.142907751 +0000 UTC m=+1222.142481683" watchObservedRunningTime="2026-04-22 19:06:50.145841273 +0000 UTC m=+1222.145415174" Apr 22 19:06:51.119944 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:06:51.119915 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:06:57.128162 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:06:57.128132 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:07:27.131675 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:27.131645 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:07:28.695913 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.695882 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl"] Apr 22 19:07:28.696455 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.696299 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kserve-container" containerID="cri-o://ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a" gracePeriod=30 Apr 22 19:07:28.696455 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.696323 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kube-rbac-proxy" containerID="cri-o://1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b" gracePeriod=30 Apr 22 19:07:28.776947 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.776915 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp"] Apr 22 19:07:28.777198 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.777185 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="storage-initializer" Apr 22 19:07:28.777198 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:07:28.777198 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="storage-initializer" Apr 22 19:07:28.777297 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.777212 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kube-rbac-proxy" Apr 22 19:07:28.777297 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.777217 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kube-rbac-proxy" Apr 22 19:07:28.777297 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.777225 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" Apr 22 19:07:28.777297 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.777230 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" Apr 22 19:07:28.777297 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.777295 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kserve-container" Apr 22 19:07:28.777465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.777310 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd41cc09-9ffe-4f84-9939-a4de20035ff4" containerName="kube-rbac-proxy" Apr 22 19:07:28.780560 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.780527 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.783405 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.783381 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 22 19:07:28.783535 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.783418 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-lightgbm-v2-kserve-predictor-serving-cert\"" Apr 22 19:07:28.790925 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.790903 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp"] Apr 22 19:07:28.837546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.837506 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a10f999-af49-41ae-8aa5-3dd695ce80ee-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.837721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.837558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.837721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.837593 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6fx4b\" (UniqueName: \"kubernetes.io/projected/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kube-api-access-6fx4b\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.837721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.837630 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.938315 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.938277 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a10f999-af49-41ae-8aa5-3dd695ce80ee-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.938526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.938325 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.938526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.938362 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fx4b\" 
(UniqueName: \"kubernetes.io/projected/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kube-api-access-6fx4b\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.938526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.938401 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.938526 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:07:28.938504 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-serving-cert: secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 22 19:07:28.938738 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:07:28.938594 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls podName:1a10f999-af49-41ae-8aa5-3dd695ce80ee nodeName:}" failed. No retries permitted until 2026-04-22 19:07:29.438572202 +0000 UTC m=+1261.438146077 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls") pod "isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" (UID: "1a10f999-af49-41ae-8aa5-3dd695ce80ee") : secret "isvc-lightgbm-v2-kserve-predictor-serving-cert" not found Apr 22 19:07:28.938738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.938706 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.939004 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.938978 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a10f999-af49-41ae-8aa5-3dd695ce80ee-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:28.949787 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:28.949721 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fx4b\" (UniqueName: \"kubernetes.io/projected/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kube-api-access-6fx4b\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:29.220666 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:29.220587 2579 generic.go:358] "Generic (PLEG): container finished" podID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" 
containerID="1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b" exitCode=2 Apr 22 19:07:29.220799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:29.220661 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerDied","Data":"1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b"} Apr 22 19:07:29.442626 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:29.442589 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:29.444960 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:29.444933 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls\") pod \"isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:29.690722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:29.690686 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:29.808815 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:29.808785 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp"] Apr 22 19:07:29.811458 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:07:29.811419 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a10f999_af49_41ae_8aa5_3dd695ce80ee.slice/crio-01eca49a935226aa280ef508770c6e8ce8bb8dc0233f2b500d01909009e4ef94 WatchSource:0}: Error finding container 01eca49a935226aa280ef508770c6e8ce8bb8dc0233f2b500d01909009e4ef94: Status 404 returned error can't find the container with id 01eca49a935226aa280ef508770c6e8ce8bb8dc0233f2b500d01909009e4ef94 Apr 22 19:07:30.225487 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:30.225445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerStarted","Data":"60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb"} Apr 22 19:07:30.225487 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:30.225489 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerStarted","Data":"01eca49a935226aa280ef508770c6e8ce8bb8dc0233f2b500d01909009e4ef94"} Apr 22 19:07:32.122474 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.122437 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.27:8643/healthz\": dial tcp 10.132.0.27:8643: connect: 
connection refused" Apr 22 19:07:32.725128 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.725107 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:07:32.767669 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.767642 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7107159-6fcc-48f6-b147-a1c7aa1972b4-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") pod \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " Apr 22 19:07:32.767795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.767691 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kserve-provision-location\") pod \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " Apr 22 19:07:32.767795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.767724 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7107159-6fcc-48f6-b147-a1c7aa1972b4-proxy-tls\") pod \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " Apr 22 19:07:32.767795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.767762 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d95b\" (UniqueName: \"kubernetes.io/projected/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kube-api-access-2d95b\") pod \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\" (UID: \"c7107159-6fcc-48f6-b147-a1c7aa1972b4\") " Apr 22 19:07:32.768030 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.768003 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/c7107159-6fcc-48f6-b147-a1c7aa1972b4-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config") pod "c7107159-6fcc-48f6-b147-a1c7aa1972b4" (UID: "c7107159-6fcc-48f6-b147-a1c7aa1972b4"). InnerVolumeSpecName "isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:32.768078 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.768041 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7107159-6fcc-48f6-b147-a1c7aa1972b4" (UID: "c7107159-6fcc-48f6-b147-a1c7aa1972b4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:32.769781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.769755 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7107159-6fcc-48f6-b147-a1c7aa1972b4-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c7107159-6fcc-48f6-b147-a1c7aa1972b4" (UID: "c7107159-6fcc-48f6-b147-a1c7aa1972b4"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:32.769849 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.769785 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kube-api-access-2d95b" (OuterVolumeSpecName: "kube-api-access-2d95b") pod "c7107159-6fcc-48f6-b147-a1c7aa1972b4" (UID: "c7107159-6fcc-48f6-b147-a1c7aa1972b4"). InnerVolumeSpecName "kube-api-access-2d95b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:32.868345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.868227 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c7107159-6fcc-48f6-b147-a1c7aa1972b4-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:07:32.868345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.868255 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2d95b\" (UniqueName: \"kubernetes.io/projected/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kube-api-access-2d95b\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:07:32.868345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.868291 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c7107159-6fcc-48f6-b147-a1c7aa1972b4-isvc-lightgbm-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:07:32.868345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:32.868302 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7107159-6fcc-48f6-b147-a1c7aa1972b4-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:07:33.235781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.235750 2579 generic.go:358] "Generic (PLEG): container finished" podID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerID="ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a" exitCode=0 Apr 22 19:07:33.236147 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.235824 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" Apr 22 19:07:33.236147 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.235822 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerDied","Data":"ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a"} Apr 22 19:07:33.236147 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.235928 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl" event={"ID":"c7107159-6fcc-48f6-b147-a1c7aa1972b4","Type":"ContainerDied","Data":"5bbe7b202d90fe5ff987de6bb2cf248f262ac8efd034e0d6b2c8d8cd3850343f"} Apr 22 19:07:33.236147 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.235943 2579 scope.go:117] "RemoveContainer" containerID="1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b" Apr 22 19:07:33.244459 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.244439 2579 scope.go:117] "RemoveContainer" containerID="ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a" Apr 22 19:07:33.251334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.251313 2579 scope.go:117] "RemoveContainer" containerID="e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3" Apr 22 19:07:33.257605 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.257583 2579 scope.go:117] "RemoveContainer" containerID="1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b" Apr 22 19:07:33.257903 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:07:33.257877 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b\": container with ID starting with 1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b not found: ID 
does not exist" containerID="1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b" Apr 22 19:07:33.258114 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.257913 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b"} err="failed to get container status \"1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b\": rpc error: code = NotFound desc = could not find container \"1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b\": container with ID starting with 1788f5e9af0d3e5dd4b42e398787c3f13127f3ff661b996b0d78bd976736768b not found: ID does not exist" Apr 22 19:07:33.258114 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.257937 2579 scope.go:117] "RemoveContainer" containerID="ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a" Apr 22 19:07:33.258366 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:07:33.258346 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a\": container with ID starting with ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a not found: ID does not exist" containerID="ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a" Apr 22 19:07:33.258468 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.258375 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a"} err="failed to get container status \"ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a\": rpc error: code = NotFound desc = could not find container \"ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a\": container with ID starting with ab9da6cafc87772d48ba08238b9852bb4f5908e3acc7bffbd5b2cbd09c231a4a not found: ID does not exist" 
Apr 22 19:07:33.258468 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.258399 2579 scope.go:117] "RemoveContainer" containerID="e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3" Apr 22 19:07:33.258636 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:07:33.258617 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3\": container with ID starting with e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3 not found: ID does not exist" containerID="e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3" Apr 22 19:07:33.258686 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.258643 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3"} err="failed to get container status \"e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3\": rpc error: code = NotFound desc = could not find container \"e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3\": container with ID starting with e7de0b8a149d5b910e7a18e5c86f0476b73d0fd470d0827da62102ca468f78c3 not found: ID does not exist" Apr 22 19:07:33.260206 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.260185 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl"] Apr 22 19:07:33.263676 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:33.263656 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8765c9667-bl6pl"] Apr 22 19:07:34.245600 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:34.245567 2579 generic.go:358] "Generic (PLEG): container finished" podID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerID="60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb" exitCode=0 Apr 22 
19:07:34.246211 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:34.245619 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerDied","Data":"60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb"} Apr 22 19:07:34.544123 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:34.544044 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" path="/var/lib/kubelet/pods/c7107159-6fcc-48f6-b147-a1c7aa1972b4/volumes" Apr 22 19:07:35.251152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:35.251117 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerStarted","Data":"a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891"} Apr 22 19:07:35.251152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:35.251156 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerStarted","Data":"98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303"} Apr 22 19:07:35.251609 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:35.251441 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:35.251609 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:35.251540 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:35.252972 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:35.252947 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 22 19:07:35.270893 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:35.270843 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" podStartSLOduration=7.270830837 podStartE2EDuration="7.270830837s" podCreationTimestamp="2026-04-22 19:07:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:07:35.269554102 +0000 UTC m=+1267.269128022" watchObservedRunningTime="2026-04-22 19:07:35.270830837 +0000 UTC m=+1267.270404733" Apr 22 19:07:36.255142 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:36.255096 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 22 19:07:41.259392 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:41.259357 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:41.259919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:41.259894 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 22 19:07:51.260412 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:51.260383 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:58.831513 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.831476 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp"] Apr 22 19:07:58.831959 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.831906 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kserve-container" containerID="cri-o://98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303" gracePeriod=30 Apr 22 19:07:58.832103 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.831927 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kube-rbac-proxy" containerID="cri-o://a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891" gracePeriod=30 Apr 22 19:07:58.912682 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.912648 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r"] Apr 22 19:07:58.912951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.912937 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kube-rbac-proxy" Apr 22 19:07:58.912992 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.912953 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kube-rbac-proxy" Apr 22 19:07:58.912992 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.912969 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" 
containerName="storage-initializer" Apr 22 19:07:58.912992 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.912975 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="storage-initializer" Apr 22 19:07:58.912992 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.912981 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kserve-container" Apr 22 19:07:58.912992 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.912987 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kserve-container" Apr 22 19:07:58.913139 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.913034 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kube-rbac-proxy" Apr 22 19:07:58.913139 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.913043 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7107159-6fcc-48f6-b147-a1c7aa1972b4" containerName="kserve-container" Apr 22 19:07:58.915916 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.915899 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:58.918587 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.918567 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-predictor-serving-cert\"" Apr 22 19:07:58.918700 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.918594 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:07:58.925149 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:58.925125 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r"] Apr 22 19:07:59.064125 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.064093 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29f65142-67d4-4772-8b6f-68498a0c9de3-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.064344 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.064216 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzksh\" (UniqueName: \"kubernetes.io/projected/29f65142-67d4-4772-8b6f-68498a0c9de3-kube-api-access-hzksh\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.064344 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.064318 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/29f65142-67d4-4772-8b6f-68498a0c9de3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.064445 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.064362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29f65142-67d4-4772-8b6f-68498a0c9de3-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.165123 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.165037 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29f65142-67d4-4772-8b6f-68498a0c9de3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.165123 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.165080 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29f65142-67d4-4772-8b6f-68498a0c9de3-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.165123 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.165121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/29f65142-67d4-4772-8b6f-68498a0c9de3-proxy-tls\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.165433 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.165186 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hzksh\" (UniqueName: \"kubernetes.io/projected/29f65142-67d4-4772-8b6f-68498a0c9de3-kube-api-access-hzksh\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.165704 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.165672 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29f65142-67d4-4772-8b6f-68498a0c9de3-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.165859 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.165837 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29f65142-67d4-4772-8b6f-68498a0c9de3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.167829 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.167806 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29f65142-67d4-4772-8b6f-68498a0c9de3-proxy-tls\") pod 
\"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.173938 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.173916 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzksh\" (UniqueName: \"kubernetes.io/projected/29f65142-67d4-4772-8b6f-68498a0c9de3-kube-api-access-hzksh\") pod \"isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.226817 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.226769 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:07:59.321917 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.321870 2579 generic.go:358] "Generic (PLEG): container finished" podID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerID="a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891" exitCode=2 Apr 22 19:07:59.322099 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.321930 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerDied","Data":"a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891"} Apr 22 19:07:59.354404 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.354344 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r"] Apr 22 19:07:59.358024 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:07:59.357993 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29f65142_67d4_4772_8b6f_68498a0c9de3.slice/crio-84d960cca55dcded139e640b1765be3813a1830a415533decb54e80bef0afe17 WatchSource:0}: Error finding container 84d960cca55dcded139e640b1765be3813a1830a415533decb54e80bef0afe17: Status 404 returned error can't find the container with id 84d960cca55dcded139e640b1765be3813a1830a415533decb54e80bef0afe17 Apr 22 19:07:59.560653 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.560628 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:07:59.669796 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.669759 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fx4b\" (UniqueName: \"kubernetes.io/projected/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kube-api-access-6fx4b\") pod \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " Apr 22 19:07:59.669796 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.669803 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a10f999-af49-41ae-8aa5-3dd695ce80ee-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") pod \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " Apr 22 19:07:59.670075 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.669842 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls\") pod \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " Apr 22 19:07:59.670075 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.669879 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kserve-provision-location\") pod \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\" (UID: \"1a10f999-af49-41ae-8aa5-3dd695ce80ee\") " Apr 22 19:07:59.670276 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.670227 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a10f999-af49-41ae-8aa5-3dd695ce80ee-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config") pod "1a10f999-af49-41ae-8aa5-3dd695ce80ee" (UID: "1a10f999-af49-41ae-8aa5-3dd695ce80ee"). InnerVolumeSpecName "isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:07:59.670363 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.670333 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1a10f999-af49-41ae-8aa5-3dd695ce80ee" (UID: "1a10f999-af49-41ae-8aa5-3dd695ce80ee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:07:59.672426 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.672399 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "1a10f999-af49-41ae-8aa5-3dd695ce80ee" (UID: "1a10f999-af49-41ae-8aa5-3dd695ce80ee"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:07:59.672546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.672454 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kube-api-access-6fx4b" (OuterVolumeSpecName: "kube-api-access-6fx4b") pod "1a10f999-af49-41ae-8aa5-3dd695ce80ee" (UID: "1a10f999-af49-41ae-8aa5-3dd695ce80ee"). InnerVolumeSpecName "kube-api-access-6fx4b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:07:59.770758 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.770654 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:07:59.770758 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.770684 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6fx4b\" (UniqueName: \"kubernetes.io/projected/1a10f999-af49-41ae-8aa5-3dd695ce80ee-kube-api-access-6fx4b\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:07:59.770758 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.770694 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/1a10f999-af49-41ae-8aa5-3dd695ce80ee-isvc-lightgbm-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:07:59.770758 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:07:59.770704 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1a10f999-af49-41ae-8aa5-3dd695ce80ee-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:08:00.326785 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.326750 2579 generic.go:358] "Generic (PLEG): container 
finished" podID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerID="98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303" exitCode=0 Apr 22 19:08:00.327254 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.326831 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" Apr 22 19:08:00.327254 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.326832 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerDied","Data":"98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303"} Apr 22 19:08:00.327254 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.326941 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp" event={"ID":"1a10f999-af49-41ae-8aa5-3dd695ce80ee","Type":"ContainerDied","Data":"01eca49a935226aa280ef508770c6e8ce8bb8dc0233f2b500d01909009e4ef94"} Apr 22 19:08:00.327254 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.326965 2579 scope.go:117] "RemoveContainer" containerID="a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891" Apr 22 19:08:00.328208 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.328100 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerStarted","Data":"a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28"} Apr 22 19:08:00.328208 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.328123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" 
event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerStarted","Data":"84d960cca55dcded139e640b1765be3813a1830a415533decb54e80bef0afe17"} Apr 22 19:08:00.334633 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.334614 2579 scope.go:117] "RemoveContainer" containerID="98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303" Apr 22 19:08:00.341705 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.341687 2579 scope.go:117] "RemoveContainer" containerID="60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb" Apr 22 19:08:00.347991 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.347976 2579 scope.go:117] "RemoveContainer" containerID="a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891" Apr 22 19:08:00.348237 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:08:00.348218 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891\": container with ID starting with a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891 not found: ID does not exist" containerID="a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891" Apr 22 19:08:00.348303 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.348246 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891"} err="failed to get container status \"a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891\": rpc error: code = NotFound desc = could not find container \"a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891\": container with ID starting with a78dec6bc3dfbc12c169dd4cce4908d255454ffeb64ed18fcb97266547d9c891 not found: ID does not exist" Apr 22 19:08:00.348303 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.348280 2579 scope.go:117] "RemoveContainer" 
containerID="98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303" Apr 22 19:08:00.348509 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:08:00.348491 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303\": container with ID starting with 98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303 not found: ID does not exist" containerID="98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303" Apr 22 19:08:00.348573 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.348521 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303"} err="failed to get container status \"98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303\": rpc error: code = NotFound desc = could not find container \"98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303\": container with ID starting with 98236f2f24dcdfd47310aa6c5bee67b2ff47f96eac4d0d9a913f72bba5d8b303 not found: ID does not exist" Apr 22 19:08:00.348573 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.348545 2579 scope.go:117] "RemoveContainer" containerID="60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb" Apr 22 19:08:00.348754 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:08:00.348736 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb\": container with ID starting with 60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb not found: ID does not exist" containerID="60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb" Apr 22 19:08:00.348789 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.348758 2579 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb"} err="failed to get container status \"60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb\": rpc error: code = NotFound desc = could not find container \"60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb\": container with ID starting with 60fb034cbd366fd0ef1f3f5225ba5f6e73b500d9fcfb8095c9da720b019507eb not found: ID does not exist" Apr 22 19:08:00.366366 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.366327 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp"] Apr 22 19:08:00.369218 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.369193 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-559bf6989-x5gcp"] Apr 22 19:08:00.543122 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:00.543090 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" path="/var/lib/kubelet/pods/1a10f999-af49-41ae-8aa5-3dd695ce80ee/volumes" Apr 22 19:08:04.343432 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:04.343397 2579 generic.go:358] "Generic (PLEG): container finished" podID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerID="a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28" exitCode=0 Apr 22 19:08:04.343805 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:04.343457 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerDied","Data":"a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28"} Apr 22 19:08:05.347722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:05.347688 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerStarted","Data":"8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d"} Apr 22 19:08:05.348168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:05.347730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerStarted","Data":"f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0"} Apr 22 19:08:05.348168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:05.348033 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:08:05.369294 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:05.369232 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" podStartSLOduration=7.36922084 podStartE2EDuration="7.36922084s" podCreationTimestamp="2026-04-22 19:07:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:08:05.367031975 +0000 UTC m=+1297.366605878" watchObservedRunningTime="2026-04-22 19:08:05.36922084 +0000 UTC m=+1297.368794737" Apr 22 19:08:06.351558 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:06.351531 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:08:12.359705 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:12.359675 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:08:42.364001 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:42.363963 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:08:49.077163 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.077091 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r"] Apr 22 19:08:49.077598 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.077434 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kserve-container" containerID="cri-o://f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0" gracePeriod=30 Apr 22 19:08:49.077598 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.077460 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kube-rbac-proxy" containerID="cri-o://8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d" gracePeriod=30 Apr 22 19:08:49.094113 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094088 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb"] Apr 22 19:08:49.094446 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094421 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kserve-container" Apr 22 19:08:49.094446 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094447 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kserve-container" Apr 22 19:08:49.094577 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094462 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kube-rbac-proxy" Apr 22 19:08:49.094577 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094470 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kube-rbac-proxy" Apr 22 19:08:49.094577 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094486 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="storage-initializer" Apr 22 19:08:49.094577 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094492 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="storage-initializer" Apr 22 19:08:49.094577 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094567 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kserve-container" Apr 22 19:08:49.094577 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.094574 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="1a10f999-af49-41ae-8aa5-3dd695ce80ee" containerName="kube-rbac-proxy" Apr 22 19:08:49.098965 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.098948 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.101632 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.101610 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-predictor-serving-cert\"" Apr 22 19:08:49.101715 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.101610 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\"" Apr 22 19:08:49.108161 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.108139 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb"] Apr 22 19:08:49.146286 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.146238 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30898fcc-0304-4ea5-96c5-1137194c01a9-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.146436 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.146322 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss5sx\" (UniqueName: \"kubernetes.io/projected/30898fcc-0304-4ea5-96c5-1137194c01a9-kube-api-access-ss5sx\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.146436 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.146352 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/30898fcc-0304-4ea5-96c5-1137194c01a9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.146436 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.146376 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30898fcc-0304-4ea5-96c5-1137194c01a9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.246920 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.246888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss5sx\" (UniqueName: \"kubernetes.io/projected/30898fcc-0304-4ea5-96c5-1137194c01a9-kube-api-access-ss5sx\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.247100 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.246934 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30898fcc-0304-4ea5-96c5-1137194c01a9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.247100 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.246955 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/30898fcc-0304-4ea5-96c5-1137194c01a9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.247100 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.246995 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30898fcc-0304-4ea5-96c5-1137194c01a9-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.247462 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.247440 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30898fcc-0304-4ea5-96c5-1137194c01a9-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.247626 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.247608 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30898fcc-0304-4ea5-96c5-1137194c01a9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.249436 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.249415 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30898fcc-0304-4ea5-96c5-1137194c01a9-proxy-tls\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: 
\"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.256236 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.256216 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss5sx\" (UniqueName: \"kubernetes.io/projected/30898fcc-0304-4ea5-96c5-1137194c01a9-kube-api-access-ss5sx\") pod \"isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.409624 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.409541 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:49.477806 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.477739 2579 generic.go:358] "Generic (PLEG): container finished" podID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerID="8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d" exitCode=2 Apr 22 19:08:49.477806 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.477773 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerDied","Data":"8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d"} Apr 22 19:08:49.538231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:49.538206 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb"] Apr 22 19:08:49.540728 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:08:49.540698 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30898fcc_0304_4ea5_96c5_1137194c01a9.slice/crio-2b14b9323ec937e47f5c5885153c782fd00bfa909384625fec0699d79a82c4eb WatchSource:0}: Error finding container 
2b14b9323ec937e47f5c5885153c782fd00bfa909384625fec0699d79a82c4eb: Status 404 returned error can't find the container with id 2b14b9323ec937e47f5c5885153c782fd00bfa909384625fec0699d79a82c4eb Apr 22 19:08:50.314842 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.314816 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:08:50.356177 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.356095 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29f65142-67d4-4772-8b6f-68498a0c9de3-kserve-provision-location\") pod \"29f65142-67d4-4772-8b6f-68498a0c9de3\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " Apr 22 19:08:50.356177 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.356142 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hzksh\" (UniqueName: \"kubernetes.io/projected/29f65142-67d4-4772-8b6f-68498a0c9de3-kube-api-access-hzksh\") pod \"29f65142-67d4-4772-8b6f-68498a0c9de3\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " Apr 22 19:08:50.356375 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.356196 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29f65142-67d4-4772-8b6f-68498a0c9de3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") pod \"29f65142-67d4-4772-8b6f-68498a0c9de3\" (UID: \"29f65142-67d4-4772-8b6f-68498a0c9de3\") " Apr 22 19:08:50.356375 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.356239 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29f65142-67d4-4772-8b6f-68498a0c9de3-proxy-tls\") pod \"29f65142-67d4-4772-8b6f-68498a0c9de3\" (UID: 
\"29f65142-67d4-4772-8b6f-68498a0c9de3\") " Apr 22 19:08:50.356556 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.356530 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29f65142-67d4-4772-8b6f-68498a0c9de3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "29f65142-67d4-4772-8b6f-68498a0c9de3" (UID: "29f65142-67d4-4772-8b6f-68498a0c9de3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:08:50.356610 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.356583 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29f65142-67d4-4772-8b6f-68498a0c9de3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config") pod "29f65142-67d4-4772-8b6f-68498a0c9de3" (UID: "29f65142-67d4-4772-8b6f-68498a0c9de3"). InnerVolumeSpecName "isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:08:50.358236 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.358216 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29f65142-67d4-4772-8b6f-68498a0c9de3-kube-api-access-hzksh" (OuterVolumeSpecName: "kube-api-access-hzksh") pod "29f65142-67d4-4772-8b6f-68498a0c9de3" (UID: "29f65142-67d4-4772-8b6f-68498a0c9de3"). InnerVolumeSpecName "kube-api-access-hzksh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:08:50.358310 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.358234 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29f65142-67d4-4772-8b6f-68498a0c9de3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "29f65142-67d4-4772-8b6f-68498a0c9de3" (UID: "29f65142-67d4-4772-8b6f-68498a0c9de3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:08:50.456951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.456912 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/29f65142-67d4-4772-8b6f-68498a0c9de3-isvc-mlflow-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:08:50.456951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.456945 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/29f65142-67d4-4772-8b6f-68498a0c9de3-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:08:50.456951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.456957 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29f65142-67d4-4772-8b6f-68498a0c9de3-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:08:50.457167 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.456967 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hzksh\" (UniqueName: \"kubernetes.io/projected/29f65142-67d4-4772-8b6f-68498a0c9de3-kube-api-access-hzksh\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:08:50.482438 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.482405 2579 generic.go:358] "Generic (PLEG): container finished" podID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerID="f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0" exitCode=0 Apr 22 19:08:50.482579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.482483 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" Apr 22 19:08:50.482579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.482499 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerDied","Data":"f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0"} Apr 22 19:08:50.482579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.482533 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r" event={"ID":"29f65142-67d4-4772-8b6f-68498a0c9de3","Type":"ContainerDied","Data":"84d960cca55dcded139e640b1765be3813a1830a415533decb54e80bef0afe17"} Apr 22 19:08:50.482579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.482552 2579 scope.go:117] "RemoveContainer" containerID="8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d" Apr 22 19:08:50.483845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.483818 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerStarted","Data":"79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11"} Apr 22 19:08:50.483948 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.483853 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerStarted","Data":"2b14b9323ec937e47f5c5885153c782fd00bfa909384625fec0699d79a82c4eb"} Apr 22 19:08:50.490654 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.490511 2579 scope.go:117] "RemoveContainer" containerID="f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0" Apr 22 19:08:50.497245 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:08:50.497230 2579 scope.go:117] "RemoveContainer" containerID="a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28" Apr 22 19:08:50.503364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.503349 2579 scope.go:117] "RemoveContainer" containerID="8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d" Apr 22 19:08:50.503593 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:08:50.503575 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d\": container with ID starting with 8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d not found: ID does not exist" containerID="8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d" Apr 22 19:08:50.503643 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.503601 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d"} err="failed to get container status \"8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d\": rpc error: code = NotFound desc = could not find container \"8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d\": container with ID starting with 8aba4a223e7bf460bcc638426b4a7b0e57f63c51c1004f48919814e6fd6c7b0d not found: ID does not exist" Apr 22 19:08:50.503643 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.503619 2579 scope.go:117] "RemoveContainer" containerID="f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0" Apr 22 19:08:50.503822 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:08:50.503805 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0\": container with ID starting with f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0 not 
found: ID does not exist" containerID="f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0" Apr 22 19:08:50.503862 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.503828 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0"} err="failed to get container status \"f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0\": rpc error: code = NotFound desc = could not find container \"f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0\": container with ID starting with f9d0621cdd9e3f4b4e5efd39f737d08eb74fb1cec31c0023994070eeecaca9b0 not found: ID does not exist" Apr 22 19:08:50.503862 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.503843 2579 scope.go:117] "RemoveContainer" containerID="a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28" Apr 22 19:08:50.504026 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:08:50.504012 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28\": container with ID starting with a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28 not found: ID does not exist" containerID="a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28" Apr 22 19:08:50.504067 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.504029 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28"} err="failed to get container status \"a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28\": rpc error: code = NotFound desc = could not find container \"a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28\": container with ID starting with a75ab763e16109a27e9c2090295ef0e231b8a80d3efa5408fcc9376c3527fb28 not found: ID does not 
exist" Apr 22 19:08:50.522952 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.522929 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r"] Apr 22 19:08:50.528122 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.528080 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5fdb47d546-t9j7r"] Apr 22 19:08:50.545408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:50.545257 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" path="/var/lib/kubelet/pods/29f65142-67d4-4772-8b6f-68498a0c9de3/volumes" Apr 22 19:08:53.493682 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:53.493647 2579 generic.go:358] "Generic (PLEG): container finished" podID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerID="79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11" exitCode=0 Apr 22 19:08:53.494049 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:53.493722 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerDied","Data":"79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11"} Apr 22 19:08:54.498071 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:54.498036 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerStarted","Data":"540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85"} Apr 22 19:08:57.511152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:57.511115 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" 
event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerStarted","Data":"9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9"} Apr 22 19:08:57.511152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:57.511156 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerStarted","Data":"b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f"} Apr 22 19:08:57.511618 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:57.511295 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:57.511618 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:57.511399 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:57.511618 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:57.511411 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:08:57.534145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:08:57.534095 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podStartSLOduration=5.389685131 podStartE2EDuration="8.534082705s" podCreationTimestamp="2026-04-22 19:08:49 +0000 UTC" firstStartedPulling="2026-04-22 19:08:53.548620264 +0000 UTC m=+1345.548194140" lastFinishedPulling="2026-04-22 19:08:56.693017826 +0000 UTC m=+1348.692591714" observedRunningTime="2026-04-22 19:08:57.532254685 +0000 UTC m=+1349.531828581" watchObservedRunningTime="2026-04-22 19:08:57.534082705 +0000 UTC m=+1349.533656601" Apr 22 19:09:03.521188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:09:03.521157 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:09:23.522380 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:09:23.522341 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 19:09:33.523228 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:09:33.523198 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:10:03.523934 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:03.523902 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:10:09.162459 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.162424 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb"] Apr 22 19:10:09.162913 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.162806 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" containerID="cri-o://540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85" gracePeriod=30 Apr 22 19:10:09.163098 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.163044 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" containerID="cri-o://9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9" gracePeriod=30 
Apr 22 19:10:09.163252 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.163051 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-agent" containerID="cri-o://b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f" gracePeriod=30 Apr 22 19:10:09.237376 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237346 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"] Apr 22 19:10:09.237644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237633 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="storage-initializer" Apr 22 19:10:09.237730 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237646 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="storage-initializer" Apr 22 19:10:09.237730 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237656 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kserve-container" Apr 22 19:10:09.237730 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237661 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kserve-container" Apr 22 19:10:09.237730 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237677 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kube-rbac-proxy" Apr 22 19:10:09.237730 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237683 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kube-rbac-proxy" Apr 22 19:10:09.237730 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:10:09.237726 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kube-rbac-proxy" Apr 22 19:10:09.237907 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.237734 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="29f65142-67d4-4772-8b6f-68498a0c9de3" containerName="kserve-container" Apr 22 19:10:09.240917 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.240900 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.243441 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.243419 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-predictor-serving-cert\"" Apr 22 19:10:09.243524 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.243442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-kube-rbac-proxy-sar-config\"" Apr 22 19:10:09.250930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.250909 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"] Apr 22 19:10:09.350199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.350163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d940a366-8bdf-4628-8f8e-3587f3aeb333-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.350388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.350208 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rf6k\" (UniqueName: 
\"kubernetes.io/projected/d940a366-8bdf-4628-8f8e-3587f3aeb333-kube-api-access-5rf6k\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.350388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.350349 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d940a366-8bdf-4628-8f8e-3587f3aeb333-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.350388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.350377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d940a366-8bdf-4628-8f8e-3587f3aeb333-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.450953 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.450920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d940a366-8bdf-4628-8f8e-3587f3aeb333-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.451138 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.450958 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5rf6k\" (UniqueName: \"kubernetes.io/projected/d940a366-8bdf-4628-8f8e-3587f3aeb333-kube-api-access-5rf6k\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: 
\"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.451138 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.451013 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d940a366-8bdf-4628-8f8e-3587f3aeb333-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.451138 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.451032 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d940a366-8bdf-4628-8f8e-3587f3aeb333-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.451567 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.451537 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d940a366-8bdf-4628-8f8e-3587f3aeb333-kserve-provision-location\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.451791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.451772 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d940a366-8bdf-4628-8f8e-3587f3aeb333-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.453327 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.453304 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d940a366-8bdf-4628-8f8e-3587f3aeb333-proxy-tls\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.460167 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.460149 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rf6k\" (UniqueName: \"kubernetes.io/projected/d940a366-8bdf-4628-8f8e-3587f3aeb333-kube-api-access-5rf6k\") pod \"isvc-paddle-predictor-6b8b7cfb4b-kwb5q\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.551374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.551303 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:09.668889 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.668858 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"] Apr 22 19:10:09.671992 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:10:09.671968 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd940a366_8bdf_4628_8f8e_3587f3aeb333.slice/crio-6c91dbb6991106f18d779632b33ce5362eb9a760093810640e72f7953579fab7 WatchSource:0}: Error finding container 6c91dbb6991106f18d779632b33ce5362eb9a760093810640e72f7953579fab7: Status 404 returned error can't find the container with id 6c91dbb6991106f18d779632b33ce5362eb9a760093810640e72f7953579fab7 Apr 22 19:10:09.712232 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.712206 2579 generic.go:358] "Generic (PLEG): container finished" podID="30898fcc-0304-4ea5-96c5-1137194c01a9" 
containerID="9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9" exitCode=2 Apr 22 19:10:09.712353 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.712284 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerDied","Data":"9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9"} Apr 22 19:10:09.713452 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:09.713407 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerStarted","Data":"6c91dbb6991106f18d779632b33ce5362eb9a760093810640e72f7953579fab7"} Apr 22 19:10:10.717250 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:10.717213 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerStarted","Data":"c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021"} Apr 22 19:10:11.721945 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:11.721907 2579 generic.go:358] "Generic (PLEG): container finished" podID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerID="540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85" exitCode=0 Apr 22 19:10:11.722328 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:11.721982 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerDied","Data":"540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85"} Apr 22 19:10:13.515792 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:13.515733 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" 
podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 22 19:10:13.521772 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:13.521743 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 19:10:14.733919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:14.733886 2579 generic.go:358] "Generic (PLEG): container finished" podID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerID="c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021" exitCode=0 Apr 22 19:10:14.734302 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:14.733963 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerDied","Data":"c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021"} Apr 22 19:10:18.515176 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:18.515131 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 22 19:10:23.515420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:23.515377 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 22 19:10:23.515995 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:23.515527 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:10:23.521923 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:23.521889 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 19:10:28.514837 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:28.514792 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 22 19:10:28.773433 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:28.773351 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerStarted","Data":"0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a"} Apr 22 19:10:28.773433 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:28.773389 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerStarted","Data":"63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d"} Apr 22 19:10:28.773648 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:28.773636 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:28.793375 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:28.793317 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podStartSLOduration=6.504304756 podStartE2EDuration="19.793302853s" podCreationTimestamp="2026-04-22 19:10:09 +0000 UTC" firstStartedPulling="2026-04-22 19:10:14.736005883 +0000 UTC m=+1426.735579759" lastFinishedPulling="2026-04-22 19:10:28.025003966 +0000 UTC m=+1440.024577856" observedRunningTime="2026-04-22 19:10:28.791531532 +0000 UTC m=+1440.791105442" watchObservedRunningTime="2026-04-22 19:10:28.793302853 +0000 UTC m=+1440.792876750" Apr 22 19:10:29.781903 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:29.781867 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:29.783161 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:29.783132 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 19:10:30.785484 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:30.785446 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 19:10:33.515362 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:33.515316 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 22 19:10:33.522787 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:33.522755 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.30:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.30:8080: connect: connection refused" Apr 22 19:10:33.522901 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:33.522873 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:10:35.789688 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:35.789656 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:10:35.790283 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:35.790211 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 19:10:38.515694 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:38.515654 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.30:8643/healthz\": dial tcp 10.132.0.30:8643: connect: connection refused" Apr 22 19:10:39.330564 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.330540 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:10:39.487994 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.487947 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30898fcc-0304-4ea5-96c5-1137194c01a9-kserve-provision-location\") pod \"30898fcc-0304-4ea5-96c5-1137194c01a9\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " Apr 22 19:10:39.488215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.488023 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss5sx\" (UniqueName: \"kubernetes.io/projected/30898fcc-0304-4ea5-96c5-1137194c01a9-kube-api-access-ss5sx\") pod \"30898fcc-0304-4ea5-96c5-1137194c01a9\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " Apr 22 19:10:39.488215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.488051 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30898fcc-0304-4ea5-96c5-1137194c01a9-proxy-tls\") pod \"30898fcc-0304-4ea5-96c5-1137194c01a9\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " Apr 22 19:10:39.488215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.488070 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30898fcc-0304-4ea5-96c5-1137194c01a9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") pod \"30898fcc-0304-4ea5-96c5-1137194c01a9\" (UID: \"30898fcc-0304-4ea5-96c5-1137194c01a9\") " Apr 22 19:10:39.488419 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.488284 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30898fcc-0304-4ea5-96c5-1137194c01a9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"30898fcc-0304-4ea5-96c5-1137194c01a9" (UID: "30898fcc-0304-4ea5-96c5-1137194c01a9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:10:39.488545 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.488515 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30898fcc-0304-4ea5-96c5-1137194c01a9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-mcp-kube-rbac-proxy-sar-config") pod "30898fcc-0304-4ea5-96c5-1137194c01a9" (UID: "30898fcc-0304-4ea5-96c5-1137194c01a9"). InnerVolumeSpecName "isvc-sklearn-mcp-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:10:39.490220 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.490189 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30898fcc-0304-4ea5-96c5-1137194c01a9-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "30898fcc-0304-4ea5-96c5-1137194c01a9" (UID: "30898fcc-0304-4ea5-96c5-1137194c01a9"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:10:39.490220 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.490194 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30898fcc-0304-4ea5-96c5-1137194c01a9-kube-api-access-ss5sx" (OuterVolumeSpecName: "kube-api-access-ss5sx") pod "30898fcc-0304-4ea5-96c5-1137194c01a9" (UID: "30898fcc-0304-4ea5-96c5-1137194c01a9"). InnerVolumeSpecName "kube-api-access-ss5sx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:10:39.589178 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.589134 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30898fcc-0304-4ea5-96c5-1137194c01a9-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:10:39.589178 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.589176 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ss5sx\" (UniqueName: \"kubernetes.io/projected/30898fcc-0304-4ea5-96c5-1137194c01a9-kube-api-access-ss5sx\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:10:39.589178 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.589187 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/30898fcc-0304-4ea5-96c5-1137194c01a9-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:10:39.589178 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.589199 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-mcp-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/30898fcc-0304-4ea5-96c5-1137194c01a9-isvc-sklearn-mcp-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:10:39.812578 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.812488 2579 generic.go:358] "Generic (PLEG): container finished" podID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerID="b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f" exitCode=137 Apr 22 19:10:39.812578 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.812531 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" 
event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerDied","Data":"b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f"} Apr 22 19:10:39.812578 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.812555 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" event={"ID":"30898fcc-0304-4ea5-96c5-1137194c01a9","Type":"ContainerDied","Data":"2b14b9323ec937e47f5c5885153c782fd00bfa909384625fec0699d79a82c4eb"} Apr 22 19:10:39.812578 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.812575 2579 scope.go:117] "RemoveContainer" containerID="9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9" Apr 22 19:10:39.812848 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.812581 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb" Apr 22 19:10:39.820893 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.820870 2579 scope.go:117] "RemoveContainer" containerID="b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f" Apr 22 19:10:39.827688 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.827670 2579 scope.go:117] "RemoveContainer" containerID="540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85" Apr 22 19:10:39.835125 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.835099 2579 scope.go:117] "RemoveContainer" containerID="79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11" Apr 22 19:10:39.835884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.835860 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb"] Apr 22 19:10:39.841215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.841188 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-5fdf4889b4-xmbrb"] Apr 22 19:10:39.842833 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:10:39.842814 2579 scope.go:117] "RemoveContainer" containerID="9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9" Apr 22 19:10:39.843117 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:10:39.843097 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9\": container with ID starting with 9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9 not found: ID does not exist" containerID="9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9" Apr 22 19:10:39.843199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.843125 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9"} err="failed to get container status \"9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9\": rpc error: code = NotFound desc = could not find container \"9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9\": container with ID starting with 9ce4e9a14932d2afd7cfbfb1272875ebd4ccd68a06ce1aa93933d0d8305d22a9 not found: ID does not exist" Apr 22 19:10:39.843199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.843144 2579 scope.go:117] "RemoveContainer" containerID="b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f" Apr 22 19:10:39.843449 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:10:39.843424 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f\": container with ID starting with b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f not found: ID does not exist" containerID="b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f" Apr 22 19:10:39.843500 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.843462 2579 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f"} err="failed to get container status \"b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f\": rpc error: code = NotFound desc = could not find container \"b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f\": container with ID starting with b2a717b1073ef2cc24c3cc97232915c7aa38427f5beacc1709351bf9496a737f not found: ID does not exist" Apr 22 19:10:39.843500 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.843486 2579 scope.go:117] "RemoveContainer" containerID="540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85" Apr 22 19:10:39.843748 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:10:39.843730 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85\": container with ID starting with 540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85 not found: ID does not exist" containerID="540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85" Apr 22 19:10:39.843799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.843754 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85"} err="failed to get container status \"540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85\": rpc error: code = NotFound desc = could not find container \"540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85\": container with ID starting with 540c22e38ca0865b3479eb17df525e9d8b817ec61dbfe65c0d11ae60b2530a85 not found: ID does not exist" Apr 22 19:10:39.843799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.843769 2579 scope.go:117] "RemoveContainer" 
containerID="79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11" Apr 22 19:10:39.843984 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:10:39.843964 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11\": container with ID starting with 79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11 not found: ID does not exist" containerID="79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11" Apr 22 19:10:39.844038 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:39.843994 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11"} err="failed to get container status \"79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11\": rpc error: code = NotFound desc = could not find container \"79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11\": container with ID starting with 79bdb37341a24a89c761f67ee3e28ca67a2c8c755873f9d2702bd333d91b5c11 not found: ID does not exist" Apr 22 19:10:40.542721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:40.542686 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" path="/var/lib/kubelet/pods/30898fcc-0304-4ea5-96c5-1137194c01a9/volumes" Apr 22 19:10:45.790943 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:45.790905 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 19:10:55.790400 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:10:55.790362 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 19:11:05.790113 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:05.790069 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 22 19:11:15.791522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:15.791488 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" Apr 22 19:11:20.712797 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.712751 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"] Apr 22 19:11:20.713420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.713102 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container" containerID="cri-o://63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d" gracePeriod=30 Apr 22 19:11:20.713420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.713213 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kube-rbac-proxy" containerID="cri-o://0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a" gracePeriod=30 Apr 22 19:11:20.785312 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785286 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"] Apr 22 19:11:20.785565 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785542 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" Apr 22 19:11:20.785565 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785556 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" Apr 22 19:11:20.785565 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785569 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-agent" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785576 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-agent" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785587 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="storage-initializer" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785593 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="storage-initializer" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785588 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.31:8643/healthz\": dial tcp 10.132.0.31:8643: connect: connection refused" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785603 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785679 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785738 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kube-rbac-proxy" Apr 22 19:11:20.785745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785746 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-agent" Apr 22 19:11:20.785986 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.785754 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="30898fcc-0304-4ea5-96c5-1137194c01a9" containerName="kserve-container" Apr 22 19:11:20.810107 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.810081 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"] Apr 22 19:11:20.810237 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.810221 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.813087 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.813063 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-predictor-serving-cert\"" Apr 22 19:11:20.813200 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.813110 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:11:20.893548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.893518 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.893548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.893550 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.893721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.893577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5zr9\" (UniqueName: \"kubernetes.io/projected/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kube-api-access-s5zr9\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: 
\"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.893721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.893650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.929924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.929895 2579 generic.go:358] "Generic (PLEG): container finished" podID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerID="0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a" exitCode=2 Apr 22 19:11:20.930051 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.929966 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerDied","Data":"0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a"} Apr 22 19:11:20.994529 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.994447 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.994529 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.994498 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.994529 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.994526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s5zr9\" (UniqueName: \"kubernetes.io/projected/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kube-api-access-s5zr9\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.994705 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.994549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.994995 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.994975 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.995130 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.995110 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod 
\"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:20.997087 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:20.997067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-proxy-tls\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:21.004250 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:21.004230 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5zr9\" (UniqueName: \"kubernetes.io/projected/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kube-api-access-s5zr9\") pod \"isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" Apr 22 19:11:21.120566 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:21.120528 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"
Apr 22 19:11:21.243690 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:21.243668 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"]
Apr 22 19:11:21.246670 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:11:21.246603 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ea624a7_4b5a_4833_b97a_43a33d5c1c29.slice/crio-54f39212088ead2aefd3dafbaa6559eebb1e5ba93c609411886ea439c4037af8 WatchSource:0}: Error finding container 54f39212088ead2aefd3dafbaa6559eebb1e5ba93c609411886ea439c4037af8: Status 404 returned error can't find the container with id 54f39212088ead2aefd3dafbaa6559eebb1e5ba93c609411886ea439c4037af8
Apr 22 19:11:21.934345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:21.934307 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerStarted","Data":"c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff"}
Apr 22 19:11:21.934345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:21.934347 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerStarted","Data":"54f39212088ead2aefd3dafbaa6559eebb1e5ba93c609411886ea439c4037af8"}
Apr 22 19:11:23.450230 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.450200 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"
Apr 22 19:11:23.514086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.514019 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d940a366-8bdf-4628-8f8e-3587f3aeb333-proxy-tls\") pod \"d940a366-8bdf-4628-8f8e-3587f3aeb333\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") "
Apr 22 19:11:23.514086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.514083 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rf6k\" (UniqueName: \"kubernetes.io/projected/d940a366-8bdf-4628-8f8e-3587f3aeb333-kube-api-access-5rf6k\") pod \"d940a366-8bdf-4628-8f8e-3587f3aeb333\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") "
Apr 22 19:11:23.514282 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.514118 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d940a366-8bdf-4628-8f8e-3587f3aeb333-isvc-paddle-kube-rbac-proxy-sar-config\") pod \"d940a366-8bdf-4628-8f8e-3587f3aeb333\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") "
Apr 22 19:11:23.514282 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.514172 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d940a366-8bdf-4628-8f8e-3587f3aeb333-kserve-provision-location\") pod \"d940a366-8bdf-4628-8f8e-3587f3aeb333\" (UID: \"d940a366-8bdf-4628-8f8e-3587f3aeb333\") "
Apr 22 19:11:23.514562 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.514532 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d940a366-8bdf-4628-8f8e-3587f3aeb333-isvc-paddle-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-kube-rbac-proxy-sar-config") pod "d940a366-8bdf-4628-8f8e-3587f3aeb333" (UID: "d940a366-8bdf-4628-8f8e-3587f3aeb333"). InnerVolumeSpecName "isvc-paddle-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:11:23.516075 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.516052 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d940a366-8bdf-4628-8f8e-3587f3aeb333-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d940a366-8bdf-4628-8f8e-3587f3aeb333" (UID: "d940a366-8bdf-4628-8f8e-3587f3aeb333"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:11:23.516193 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.516171 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d940a366-8bdf-4628-8f8e-3587f3aeb333-kube-api-access-5rf6k" (OuterVolumeSpecName: "kube-api-access-5rf6k") pod "d940a366-8bdf-4628-8f8e-3587f3aeb333" (UID: "d940a366-8bdf-4628-8f8e-3587f3aeb333"). InnerVolumeSpecName "kube-api-access-5rf6k". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:11:23.523168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.523144 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d940a366-8bdf-4628-8f8e-3587f3aeb333-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d940a366-8bdf-4628-8f8e-3587f3aeb333" (UID: "d940a366-8bdf-4628-8f8e-3587f3aeb333"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:11:23.615467 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.615437 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5rf6k\" (UniqueName: \"kubernetes.io/projected/d940a366-8bdf-4628-8f8e-3587f3aeb333-kube-api-access-5rf6k\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:11:23.615467 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.615465 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d940a366-8bdf-4628-8f8e-3587f3aeb333-isvc-paddle-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:11:23.615645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.615480 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d940a366-8bdf-4628-8f8e-3587f3aeb333-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:11:23.615645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.615492 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d940a366-8bdf-4628-8f8e-3587f3aeb333-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:11:23.941877 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.941842 2579 generic.go:358] "Generic (PLEG): container finished" podID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerID="63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d" exitCode=0
Apr 22 19:11:23.942069 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.941933 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"
Apr 22 19:11:23.942069 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.941936 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerDied","Data":"63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d"}
Apr 22 19:11:23.942069 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.941982 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q" event={"ID":"d940a366-8bdf-4628-8f8e-3587f3aeb333","Type":"ContainerDied","Data":"6c91dbb6991106f18d779632b33ce5362eb9a760093810640e72f7953579fab7"}
Apr 22 19:11:23.942069 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.942002 2579 scope.go:117] "RemoveContainer" containerID="0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a"
Apr 22 19:11:23.949998 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.949982 2579 scope.go:117] "RemoveContainer" containerID="63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d"
Apr 22 19:11:23.956844 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.956827 2579 scope.go:117] "RemoveContainer" containerID="c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021"
Apr 22 19:11:23.964013 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.963993 2579 scope.go:117] "RemoveContainer" containerID="0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a"
Apr 22 19:11:23.964311 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:11:23.964288 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a\": container with ID starting with 0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a not found: ID does not exist" containerID="0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a"
Apr 22 19:11:23.964409 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.964326 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a"} err="failed to get container status \"0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a\": rpc error: code = NotFound desc = could not find container \"0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a\": container with ID starting with 0ac4c892076af72fd7a15353a79dbdce061ff95598695e92ed72d08f5858a03a not found: ID does not exist"
Apr 22 19:11:23.964409 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.964351 2579 scope.go:117] "RemoveContainer" containerID="63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d"
Apr 22 19:11:23.964621 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:11:23.964601 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d\": container with ID starting with 63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d not found: ID does not exist" containerID="63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d"
Apr 22 19:11:23.964694 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.964630 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d"} err="failed to get container status \"63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d\": rpc error: code = NotFound desc = could not find container \"63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d\": container with ID starting with 63841e86dc67baf98a7df1b18dc7689a0b99c84bde92f2c40051637a929e6a7d not found: ID does not exist"
Apr 22 19:11:23.964694 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.964653 2579 scope.go:117] "RemoveContainer" containerID="c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021"
Apr 22 19:11:23.964792 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.964767 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"]
Apr 22 19:11:23.964900 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:11:23.964883 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021\": container with ID starting with c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021 not found: ID does not exist" containerID="c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021"
Apr 22 19:11:23.964935 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.964906 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021"} err="failed to get container status \"c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021\": rpc error: code = NotFound desc = could not find container \"c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021\": container with ID starting with c1590ea562e0045ecb7db46ddedb59ab07b93aab84c3a5fafbabb128d9452021 not found: ID does not exist"
Apr 22 19:11:23.970725 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:23.970702 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-6b8b7cfb4b-kwb5q"]
Apr 22 19:11:24.543019 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:24.542987 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" path="/var/lib/kubelet/pods/d940a366-8bdf-4628-8f8e-3587f3aeb333/volumes"
Apr 22 19:11:25.951903 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:25.951869 2579 generic.go:358] "Generic (PLEG): container finished" podID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerID="c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff" exitCode=0
Apr 22 19:11:25.952324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:25.951943 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerDied","Data":"c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff"}
Apr 22 19:11:26.957133 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:26.957098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerStarted","Data":"1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077"}
Apr 22 19:11:26.957133 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:26.957138 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerStarted","Data":"6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42"}
Apr 22 19:11:26.957583 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:26.957402 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"
Apr 22 19:11:26.978919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:26.978861 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podStartSLOduration=6.978840891 podStartE2EDuration="6.978840891s" podCreationTimestamp="2026-04-22 19:11:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:11:26.97659996 +0000 UTC m=+1498.976173869" watchObservedRunningTime="2026-04-22 19:11:26.978840891 +0000 UTC m=+1498.978414789"
Apr 22 19:11:27.959938 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:27.959905 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"
Apr 22 19:11:27.961311 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:27.961283 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 22 19:11:28.962821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:28.962789 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 22 19:11:33.966971 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:33.966941 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"
Apr 22 19:11:33.967619 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:33.967581 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 22 19:11:43.968345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:43.968306 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 22 19:11:53.968531 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:11:53.968449 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 22 19:12:03.967679 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:03.967640 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 22 19:12:13.969015 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:13.968985 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"
Apr 22 19:12:22.301458 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.301414 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"]
Apr 22 19:12:22.301850 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.301728 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" containerID="cri-o://6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42" gracePeriod=30
Apr 22 19:12:22.301924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.301845 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kube-rbac-proxy" containerID="cri-o://1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077" gracePeriod=30
Apr 22 19:12:22.376248 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376213 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"]
Apr 22 19:12:22.376495 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376483 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container"
Apr 22 19:12:22.376554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376496 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container"
Apr 22 19:12:22.376554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376508 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="storage-initializer"
Apr 22 19:12:22.376554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376514 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="storage-initializer"
Apr 22 19:12:22.376554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376533 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kube-rbac-proxy"
Apr 22 19:12:22.376554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376539 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kube-rbac-proxy"
Apr 22 19:12:22.376725 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376583 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kube-rbac-proxy"
Apr 22 19:12:22.376725 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.376590 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d940a366-8bdf-4628-8f8e-3587f3aeb333" containerName="kserve-container"
Apr 22 19:12:22.379617 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.379602 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.382723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.382702 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-predictor-serving-cert\""
Apr 22 19:12:22.382723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.382705 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\""
Apr 22 19:12:22.388849 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.388825 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"]
Apr 22 19:12:22.468305 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.468251 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.468478 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.468365 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvzcg\" (UniqueName: \"kubernetes.io/projected/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kube-api-access-jvzcg\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.468478 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.468425 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.468478 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.468450 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.569604 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.569514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.569604 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.569568 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jvzcg\" (UniqueName: \"kubernetes.io/projected/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kube-api-access-jvzcg\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.569604 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.569591 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.569884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.569612 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.569958 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.569933 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.570187 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.570167 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.571959 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.571933 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-proxy-tls\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.578939 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.578917 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvzcg\" (UniqueName: \"kubernetes.io/projected/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kube-api-access-jvzcg\") pod \"isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.690659 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.690605 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:22.812750 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.812720 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"]
Apr 22 19:12:22.815632 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:12:22.815602 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9114b27_d90f_4035_9fa8_c7a0de3f6ebc.slice/crio-d5cc665e0c39b3dbfac87042491551f7e23998de07743d62f2661510fbe475f5 WatchSource:0}: Error finding container d5cc665e0c39b3dbfac87042491551f7e23998de07743d62f2661510fbe475f5: Status 404 returned error can't find the container with id d5cc665e0c39b3dbfac87042491551f7e23998de07743d62f2661510fbe475f5
Apr 22 19:12:22.817351 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:22.817335 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:12:23.116823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:23.116729 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerStarted","Data":"cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121"}
Apr 22 19:12:23.116823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:23.116774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerStarted","Data":"d5cc665e0c39b3dbfac87042491551f7e23998de07743d62f2661510fbe475f5"}
Apr 22 19:12:23.118403 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:23.118372 2579 generic.go:358] "Generic (PLEG): container finished" podID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerID="1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077" exitCode=2
Apr 22 19:12:23.118508 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:23.118426 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerDied","Data":"1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077"}
Apr 22 19:12:23.964039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:23.964000 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.32:8643/healthz\": dial tcp 10.132.0.32:8643: connect: connection refused"
Apr 22 19:12:23.968388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:23.968366 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 22 19:12:25.035378 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.035345 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"
Apr 22 19:12:25.090678 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.090641 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") pod \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") "
Apr 22 19:12:25.090678 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.090680 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s5zr9\" (UniqueName: \"kubernetes.io/projected/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kube-api-access-s5zr9\") pod \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") "
Apr 22 19:12:25.090873 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.090724 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-proxy-tls\") pod \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") "
Apr 22 19:12:25.090873 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.090765 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kserve-provision-location\") pod \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\" (UID: \"3ea624a7-4b5a-4833-b97a-43a33d5c1c29\") "
Apr 22 19:12:25.091064 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.091035 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-isvc-paddle-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-paddle-runtime-kube-rbac-proxy-sar-config") pod "3ea624a7-4b5a-4833-b97a-43a33d5c1c29" (UID: "3ea624a7-4b5a-4833-b97a-43a33d5c1c29"). InnerVolumeSpecName "isvc-paddle-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:12:25.092796 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.092766 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kube-api-access-s5zr9" (OuterVolumeSpecName: "kube-api-access-s5zr9") pod "3ea624a7-4b5a-4833-b97a-43a33d5c1c29" (UID: "3ea624a7-4b5a-4833-b97a-43a33d5c1c29"). InnerVolumeSpecName "kube-api-access-s5zr9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:12:25.092796 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.092777 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3ea624a7-4b5a-4833-b97a-43a33d5c1c29" (UID: "3ea624a7-4b5a-4833-b97a-43a33d5c1c29"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:12:25.100673 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.100648 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3ea624a7-4b5a-4833-b97a-43a33d5c1c29" (UID: "3ea624a7-4b5a-4833-b97a-43a33d5c1c29"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:12:25.125630 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.125603 2579 generic.go:358] "Generic (PLEG): container finished" podID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerID="6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42" exitCode=0
Apr 22 19:12:25.125740 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.125639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerDied","Data":"6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42"}
Apr 22 19:12:25.125740 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.125663 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj" event={"ID":"3ea624a7-4b5a-4833-b97a-43a33d5c1c29","Type":"ContainerDied","Data":"54f39212088ead2aefd3dafbaa6559eebb1e5ba93c609411886ea439c4037af8"}
Apr 22 19:12:25.125740 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.125678 2579 scope.go:117] "RemoveContainer" containerID="1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077"
Apr 22 19:12:25.125740 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.125678 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"
Apr 22 19:12:25.134130 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.134107 2579 scope.go:117] "RemoveContainer" containerID="6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42"
Apr 22 19:12:25.140979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.140959 2579 scope.go:117] "RemoveContainer" containerID="c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff"
Apr 22 19:12:25.147248 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.147232 2579 scope.go:117] "RemoveContainer" containerID="1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077"
Apr 22 19:12:25.147512 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:12:25.147497 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077\": container with ID starting with 1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077 not found: ID does not exist" containerID="1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077"
Apr 22 19:12:25.147548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.147520 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077"} err="failed to get container status \"1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077\": rpc error: code = NotFound desc = could not find container \"1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077\": container with ID starting with 1607c61669222eaabd3055162acc5ef2a581f821513d49d4ec776104093b5077 not found: ID does not exist"
Apr 22 19:12:25.147548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.147538 2579 scope.go:117] "RemoveContainer" containerID="6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42"
Apr 22 19:12:25.147727 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:12:25.147713 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42\": container with ID starting with 6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42 not found: ID does not exist" containerID="6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42"
Apr 22 19:12:25.147806 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.147732 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42"} err="failed to get container status \"6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42\": rpc error: code = NotFound desc = could not find container \"6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42\": container with ID starting with 6b1680405fdebd5b23ee7f089ae87992a0932442867b28558d7fd46b22ca7b42 not found: ID does not exist"
Apr 22 19:12:25.147806 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.147745 2579 scope.go:117] "RemoveContainer" containerID="c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff"
Apr 22 19:12:25.147927 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:12:25.147901 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff\": container with ID starting with c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff not found: ID does not exist" containerID="c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff"
Apr 22 19:12:25.147967 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.147934 2579 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff"} err="failed to get container status \"c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff\": rpc error: code = NotFound desc = could not find container \"c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff\": container with ID starting with c430b36848ec021fd04099bb88158966231c2a936d7358dfe0ee9fd6df86b1ff not found: ID does not exist" Apr 22 19:12:25.150325 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.150300 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"] Apr 22 19:12:25.155126 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.155107 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-7f4d4f9dc8-frxjj"] Apr 22 19:12:25.191889 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.191862 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:12:25.191973 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.191887 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-isvc-paddle-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:12:25.191973 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.191911 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s5zr9\" (UniqueName: \"kubernetes.io/projected/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-kube-api-access-s5zr9\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:12:25.191973 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:25.191920 2579 
reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3ea624a7-4b5a-4833-b97a-43a33d5c1c29-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:12:26.543922 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:26.543886 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" path="/var/lib/kubelet/pods/3ea624a7-4b5a-4833-b97a-43a33d5c1c29/volumes" Apr 22 19:12:28.135407 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:28.135370 2579 generic.go:358] "Generic (PLEG): container finished" podID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerID="cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121" exitCode=0 Apr 22 19:12:28.135806 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:28.135444 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerDied","Data":"cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121"} Apr 22 19:12:29.145421 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:29.145385 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerStarted","Data":"776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e"} Apr 22 19:12:29.145421 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:29.145427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerStarted","Data":"a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e"} Apr 22 19:12:29.145815 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:29.145727 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:29.145868 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:29.145851 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:29.147080 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:29.147054 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 22 19:12:29.165365 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:29.165325 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podStartSLOduration=7.165314649 podStartE2EDuration="7.165314649s" podCreationTimestamp="2026-04-22 19:12:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:12:29.163940769 +0000 UTC m=+1561.163514691" watchObservedRunningTime="2026-04-22 19:12:29.165314649 +0000 UTC m=+1561.164888635"
Apr 22 19:12:30.148280 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:30.148234 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 22 19:12:35.153689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:35.153660 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:12:35.154323 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:35.154294 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 22 19:12:45.155004 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:45.154963 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 22 19:12:55.154728 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:12:55.154689 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 22 19:13:05.155135 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:05.155099 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 22 19:13:15.154398 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:15.154370 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:13:24.326359 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.326328 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"]
Apr 22 19:13:24.326826 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.326715 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" containerID="cri-o://a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e" gracePeriod=30
Apr 22 19:13:24.327612 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.327106 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kube-rbac-proxy" containerID="cri-o://776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e" gracePeriod=30
Apr 22 19:13:24.396666 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.396633 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"]
Apr 22 19:13:24.396942 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.396918 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kube-rbac-proxy"
Apr 22 19:13:24.396942 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.396933 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kube-rbac-proxy"
Apr 22 19:13:24.397026 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.396947 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="storage-initializer"
Apr 22 19:13:24.397026 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.396953 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="storage-initializer"
Apr 22 19:13:24.397026 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.396967 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container"
Apr 22 19:13:24.397026 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.396976 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container"
Apr 22 19:13:24.397026 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.397017 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kube-rbac-proxy"
Apr 22 19:13:24.397026 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.397024 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ea624a7-4b5a-4833-b97a-43a33d5c1c29" containerName="kserve-container"
Apr 22 19:13:24.400052 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.400035 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.402957 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.402940 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-kube-rbac-proxy-sar-config\""
Apr 22 19:13:24.403047 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.402943 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-predictor-serving-cert\""
Apr 22 19:13:24.410933 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.410899 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"]
Apr 22 19:13:24.538324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.538292 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.538486 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.538369 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pfv\" (UniqueName: \"kubernetes.io/projected/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kube-api-access-l7pfv\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.538486 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.538421 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.538486 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.538466 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.639829 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.639734 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pfv\" (UniqueName: \"kubernetes.io/projected/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kube-api-access-l7pfv\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.639829 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.639802 2579
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.640054 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.639833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.640054 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.639865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.640321 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.640297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kserve-provision-location\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.640684 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.640660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.642339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.642315 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-proxy-tls\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.649436 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.649410 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pfv\" (UniqueName: \"kubernetes.io/projected/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kube-api-access-l7pfv\") pod \"isvc-pmml-predictor-8bb578669-29k6m\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.710822 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.710787 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:24.828638 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:24.828608 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"]
Apr 22 19:13:24.832120 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:13:24.832094 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ce0e8ca_d5a2_4278_bfff_8af6ab08a88a.slice/crio-889b6d3ef79fdbe303ccfdbf28dfad4806f5aef2f0ad1b5cbea15dea5f2bddf4 WatchSource:0}: Error finding container 889b6d3ef79fdbe303ccfdbf28dfad4806f5aef2f0ad1b5cbea15dea5f2bddf4: Status 404 returned error can't find the container with id 889b6d3ef79fdbe303ccfdbf28dfad4806f5aef2f0ad1b5cbea15dea5f2bddf4
Apr 22 19:13:25.149284 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:25.149219 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.33:8643/healthz\": dial tcp 10.132.0.33:8643: connect: connection refused"
Apr 22 19:13:25.154643 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:25.154613 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.33:8080: connect: connection refused"
Apr 22 19:13:25.299744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:25.299706 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerStarted","Data":"f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe"}
Apr 22 19:13:25.299744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:25.299747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerStarted","Data":"889b6d3ef79fdbe303ccfdbf28dfad4806f5aef2f0ad1b5cbea15dea5f2bddf4"}
Apr 22 19:13:25.301757 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:25.301733 2579 generic.go:358] "Generic (PLEG): container finished" podID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerID="776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e" exitCode=2
Apr 22 19:13:25.301861 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:25.301788 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerDied","Data":"776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e"}
Apr 22 19:13:27.063672 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.063650 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:13:27.159628 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.159542 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") pod \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") "
Apr 22 19:13:27.159628 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.159623 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kserve-provision-location\") pod \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") "
Apr 22 19:13:27.159823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.159657 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-proxy-tls\") pod \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") "
Apr 22 19:13:27.159823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.159684 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvzcg\" (UniqueName: \"kubernetes.io/projected/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kube-api-access-jvzcg\") pod \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\" (UID: \"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc\") "
Apr 22 19:13:27.159985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.159956 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName:
"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config") pod "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" (UID: "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc"). InnerVolumeSpecName "isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:13:27.161755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.161722 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" (UID: "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:13:27.161878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.161787 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kube-api-access-jvzcg" (OuterVolumeSpecName: "kube-api-access-jvzcg") pod "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" (UID: "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc"). InnerVolumeSpecName "kube-api-access-jvzcg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:13:27.169476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.169449 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" (UID: "d9114b27-d90f-4035-9fa8-c7a0de3f6ebc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:13:27.260563 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.260528 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:13:27.260563 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.260561 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:13:27.260563 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.260571 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jvzcg\" (UniqueName: \"kubernetes.io/projected/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-kube-api-access-jvzcg\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:13:27.260751 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.260580 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc-isvc-paddle-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:13:27.309163 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.309122 2579 generic.go:358] "Generic (PLEG): container finished" podID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerID="a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e" exitCode=0
Apr 22 19:13:27.309323 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.309208 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"
Apr 22 19:13:27.309323 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.309202 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerDied","Data":"a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e"}
Apr 22 19:13:27.309402 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.309323 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7" event={"ID":"d9114b27-d90f-4035-9fa8-c7a0de3f6ebc","Type":"ContainerDied","Data":"d5cc665e0c39b3dbfac87042491551f7e23998de07743d62f2661510fbe475f5"}
Apr 22 19:13:27.309402 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.309341 2579 scope.go:117] "RemoveContainer" containerID="776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e"
Apr 22 19:13:27.317422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.317407 2579 scope.go:117] "RemoveContainer" containerID="a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e"
Apr 22 19:13:27.324193 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.324173 2579 scope.go:117] "RemoveContainer" containerID="cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121"
Apr 22 19:13:27.330769 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.330751 2579 scope.go:117] "RemoveContainer" containerID="776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e"
Apr 22 19:13:27.331046 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:13:27.331013 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e\": container with ID starting with 776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e not found: ID does not exist" containerID="776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e"
Apr 22 19:13:27.331175 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.331058 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e"} err="failed to get container status \"776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e\": rpc error: code = NotFound desc = could not find container \"776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e\": container with ID starting with 776c758be9503da04d968f7aa10db83b3459f11d6e4576879c836db9195ba33e not found: ID does not exist"
Apr 22 19:13:27.331175 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.331099 2579 scope.go:117] "RemoveContainer" containerID="a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e"
Apr 22 19:13:27.331420 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:13:27.331351 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e\": container with ID starting with a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e not found: ID does not exist" containerID="a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e"
Apr 22 19:13:27.331420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.331389 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e"} err="failed to get container status \"a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e\": rpc error: code = NotFound desc = could not find container \"a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e\": container with ID starting with a5b1981717e463ed8c2ee68e0a4b592251b33d87974a041d1c8ee63505b3886e not found: ID does not exist"
Apr 22 19:13:27.331420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.331411 2579 scope.go:117] "RemoveContainer" containerID="cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121"
Apr 22 19:13:27.331686 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:13:27.331666 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121\": container with ID starting with cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121 not found: ID does not exist" containerID="cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121"
Apr 22 19:13:27.331751 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.331692 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121"} err="failed to get container status \"cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121\": rpc error: code = NotFound desc = could not find container \"cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121\": container with ID starting with cf721c2a1ea0c2402631e71688ff2755dc25f1cafb8785f84e8dc89c66ddb121 not found: ID does not exist"
Apr 22 19:13:27.331795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.331747 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"]
Apr 22 19:13:27.335294 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:27.335276 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7dbd59854-jxtc7"]
Apr 22 19:13:28.543519 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:28.543481 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" path="/var/lib/kubelet/pods/d9114b27-d90f-4035-9fa8-c7a0de3f6ebc/volumes"
Apr 22 19:13:29.316135 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:29.316102 2579 generic.go:358] "Generic (PLEG): container finished" podID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerID="f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe" exitCode=0
Apr 22 19:13:29.316335 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:29.316178 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerDied","Data":"f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe"}
Apr 22 19:13:36.339171 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:36.339143 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerStarted","Data":"eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3"}
Apr 22 19:13:36.339586 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:36.339186 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerStarted","Data":"5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e"}
Apr 22 19:13:36.339586 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:36.339502 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"
Apr 22 19:13:36.361689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:36.361646 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podStartSLOduration=5.48831986 podStartE2EDuration="12.361633329s" podCreationTimestamp="2026-04-22 19:13:24 +0000 UTC" firstStartedPulling="2026-04-22 19:13:29.317239546 +0000 UTC m=+1621.316813419" lastFinishedPulling="2026-04-22 19:13:36.190553 +0000 UTC
m=+1628.190126888" observedRunningTime="2026-04-22 19:13:36.359963752 +0000 UTC m=+1628.359537650" watchObservedRunningTime="2026-04-22 19:13:36.361633329 +0000 UTC m=+1628.361207226" Apr 22 19:13:37.341754 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:37.341723 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" Apr 22 19:13:37.342994 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:37.342970 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:13:38.344342 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:38.344290 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:13:43.349213 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:43.349180 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" Apr 22 19:13:43.349794 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:43.349765 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:13:53.349759 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:13:53.349715 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:14:03.350495 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:03.350454 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:14:13.350474 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:13.350428 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:14:23.350435 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:23.350392 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:14:33.349825 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:33.349776 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:14:43.350105 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:43.350068 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 22 19:14:53.353500 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:53.353422 2579 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" Apr 22 19:14:55.910503 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:55.910463 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"] Apr 22 19:14:55.910890 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:55.910833 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" containerID="cri-o://5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e" gracePeriod=30 Apr 22 19:14:55.911009 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:55.910889 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kube-rbac-proxy" containerID="cri-o://eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3" gracePeriod=30 Apr 22 19:14:56.001450 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001420 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x"] Apr 22 19:14:56.001830 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001814 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" Apr 22 19:14:56.001896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001834 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" Apr 22 19:14:56.001896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001849 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" 
containerName="kube-rbac-proxy" Apr 22 19:14:56.001896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001858 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kube-rbac-proxy" Apr 22 19:14:56.001896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001873 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="storage-initializer" Apr 22 19:14:56.001896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001885 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="storage-initializer" Apr 22 19:14:56.002119 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001968 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kube-rbac-proxy" Apr 22 19:14:56.002119 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.001982 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9114b27-d90f-4035-9fa8-c7a0de3f6ebc" containerName="kserve-container" Apr 22 19:14:56.005554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.005537 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.008102 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.008073 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:14:56.008208 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.008103 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-runtime-predictor-serving-cert\"" Apr 22 19:14:56.011769 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.011745 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x"] Apr 22 19:14:56.024110 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.024090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41d8fc58-8930-455b-b0df-08f473e37944-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.024217 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.024124 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d8fc58-8930-455b-b0df-08f473e37944-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.024217 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.024148 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78hts\" (UniqueName: 
\"kubernetes.io/projected/41d8fc58-8930-455b-b0df-08f473e37944-kube-api-access-78hts\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.024352 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.024234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/41d8fc58-8930-455b-b0df-08f473e37944-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.125383 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.125344 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/41d8fc58-8930-455b-b0df-08f473e37944-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.125559 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.125431 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41d8fc58-8930-455b-b0df-08f473e37944-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.125559 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.125451 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/41d8fc58-8930-455b-b0df-08f473e37944-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.125559 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.125473 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-78hts\" (UniqueName: \"kubernetes.io/projected/41d8fc58-8930-455b-b0df-08f473e37944-kube-api-access-78hts\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.125945 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.125923 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d8fc58-8930-455b-b0df-08f473e37944-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.126057 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.126032 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/41d8fc58-8930-455b-b0df-08f473e37944-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.127973 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.127950 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41d8fc58-8930-455b-b0df-08f473e37944-proxy-tls\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" 
(UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.134660 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.134634 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-78hts\" (UniqueName: \"kubernetes.io/projected/41d8fc58-8930-455b-b0df-08f473e37944-kube-api-access-78hts\") pod \"isvc-pmml-runtime-predictor-67bc544947-4278x\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.316811 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.316777 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:14:56.434863 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.434836 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x"] Apr 22 19:14:56.437475 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:14:56.437451 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41d8fc58_8930_455b_b0df_08f473e37944.slice/crio-53f4c8ec6e30b672ea3433ad88a9e364567983800ffdadac02c06e22c6c3fefa WatchSource:0}: Error finding container 53f4c8ec6e30b672ea3433ad88a9e364567983800ffdadac02c06e22c6c3fefa: Status 404 returned error can't find the container with id 53f4c8ec6e30b672ea3433ad88a9e364567983800ffdadac02c06e22c6c3fefa Apr 22 19:14:56.552723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.552689 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerStarted","Data":"15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba"} Apr 22 19:14:56.552723 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:14:56.552728 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerStarted","Data":"53f4c8ec6e30b672ea3433ad88a9e364567983800ffdadac02c06e22c6c3fefa"} Apr 22 19:14:56.554616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.554585 2579 generic.go:358] "Generic (PLEG): container finished" podID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerID="eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3" exitCode=2 Apr 22 19:14:56.554727 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:56.554660 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerDied","Data":"eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3"} Apr 22 19:14:58.344717 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:58.344675 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.34:8643/healthz\": dial tcp 10.132.0.34:8643: connect: connection refused" Apr 22 19:14:59.250133 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.250111 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" Apr 22 19:14:59.350639 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.350560 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7pfv\" (UniqueName: \"kubernetes.io/projected/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kube-api-access-l7pfv\") pod \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " Apr 22 19:14:59.350639 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.350628 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kserve-provision-location\") pod \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " Apr 22 19:14:59.351046 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.350660 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-isvc-pmml-kube-rbac-proxy-sar-config\") pod \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " Apr 22 19:14:59.351046 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.350718 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-proxy-tls\") pod \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\" (UID: \"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a\") " Apr 22 19:14:59.351046 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.350922 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" (UID: 
"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:14:59.351046 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.351019 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-isvc-pmml-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-kube-rbac-proxy-sar-config") pod "4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" (UID: "4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a"). InnerVolumeSpecName "isvc-pmml-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:14:59.352650 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.352629 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kube-api-access-l7pfv" (OuterVolumeSpecName: "kube-api-access-l7pfv") pod "4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" (UID: "4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a"). InnerVolumeSpecName "kube-api-access-l7pfv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:14:59.352746 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.352707 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" (UID: "4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:14:59.451932 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.451899 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:14:59.451932 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.451927 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-isvc-pmml-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:14:59.452156 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.451939 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:14:59.452156 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.451950 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l7pfv\" (UniqueName: \"kubernetes.io/projected/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a-kube-api-access-l7pfv\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:14:59.566745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.566712 2579 generic.go:358] "Generic (PLEG): container finished" podID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerID="5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e" exitCode=0 Apr 22 19:14:59.566898 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.566785 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" Apr 22 19:14:59.566898 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.566784 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerDied","Data":"5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e"} Apr 22 19:14:59.567022 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.566923 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m" event={"ID":"4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a","Type":"ContainerDied","Data":"889b6d3ef79fdbe303ccfdbf28dfad4806f5aef2f0ad1b5cbea15dea5f2bddf4"} Apr 22 19:14:59.567022 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.566952 2579 scope.go:117] "RemoveContainer" containerID="eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3" Apr 22 19:14:59.574680 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.574661 2579 scope.go:117] "RemoveContainer" containerID="5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e" Apr 22 19:14:59.581632 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.581617 2579 scope.go:117] "RemoveContainer" containerID="f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe" Apr 22 19:14:59.587814 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.587798 2579 scope.go:117] "RemoveContainer" containerID="eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3" Apr 22 19:14:59.588049 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:14:59.588031 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3\": container with ID starting with eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3 not found: ID does not exist" 
containerID="eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3" Apr 22 19:14:59.588111 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.588054 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3"} err="failed to get container status \"eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3\": rpc error: code = NotFound desc = could not find container \"eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3\": container with ID starting with eb085235fb837c4b3deec08460ca9fb9fba598e16fc6dbf0ba73d63460c95dc3 not found: ID does not exist" Apr 22 19:14:59.588111 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.588070 2579 scope.go:117] "RemoveContainer" containerID="5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e" Apr 22 19:14:59.588297 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:14:59.588279 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e\": container with ID starting with 5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e not found: ID does not exist" containerID="5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e" Apr 22 19:14:59.588364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.588302 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e"} err="failed to get container status \"5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e\": rpc error: code = NotFound desc = could not find container \"5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e\": container with ID starting with 5d25419b2fbec89de41ca6fe9c61ddb430716f360f4ecac2ac90886a3799cc7e not found: ID does not exist" Apr 22 
19:14:59.588364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.588318 2579 scope.go:117] "RemoveContainer" containerID="f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe" Apr 22 19:14:59.588501 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:14:59.588486 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe\": container with ID starting with f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe not found: ID does not exist" containerID="f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe" Apr 22 19:14:59.588540 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.588504 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe"} err="failed to get container status \"f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe\": rpc error: code = NotFound desc = could not find container \"f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe\": container with ID starting with f19c041aed6b93259e912ef74be76b077d9bda4fdbbf7467154be9744b7e6afe not found: ID does not exist" Apr 22 19:14:59.592434 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.592416 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"] Apr 22 19:14:59.598713 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:14:59.598694 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-8bb578669-29k6m"] Apr 22 19:15:00.544068 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:00.543998 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" path="/var/lib/kubelet/pods/4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a/volumes" Apr 22 19:15:00.570183 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:15:00.570154 2579 generic.go:358] "Generic (PLEG): container finished" podID="41d8fc58-8930-455b-b0df-08f473e37944" containerID="15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba" exitCode=0 Apr 22 19:15:00.570346 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:00.570232 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerDied","Data":"15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba"} Apr 22 19:15:01.575320 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:01.575287 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerStarted","Data":"9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5"} Apr 22 19:15:01.575320 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:01.575324 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerStarted","Data":"f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69"} Apr 22 19:15:01.575751 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:01.575607 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:15:01.575751 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:01.575740 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:15:01.576812 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:01.576788 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" 
podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:15:01.595423 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:01.595383 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podStartSLOduration=6.595371332 podStartE2EDuration="6.595371332s" podCreationTimestamp="2026-04-22 19:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:15:01.593495835 +0000 UTC m=+1713.593069731" watchObservedRunningTime="2026-04-22 19:15:01.595371332 +0000 UTC m=+1713.594945296" Apr 22 19:15:02.578054 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:02.578011 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:15:07.583476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:07.583446 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:15:07.584013 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:07.583988 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:15:17.584944 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:17.584906 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" 
podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:15:27.584580 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:27.584539 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:15:37.584384 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:37.584345 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:15:47.584137 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:47.584099 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:15:57.584701 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:15:57.584662 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:16:07.584378 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:07.584338 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.35:8080: connect: connection refused" Apr 22 19:16:13.539132 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:13.539086 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 22 19:16:23.539902 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:23.539871 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:16:27.206116 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.206080 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x"] Apr 22 19:16:27.206546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.206417 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container" containerID="cri-o://f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69" gracePeriod=30 Apr 22 19:16:27.206546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.206477 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kube-rbac-proxy" containerID="cri-o://9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5" gracePeriod=30 Apr 22 19:16:27.297964 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.297930 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"] Apr 22 19:16:27.298207 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298196 2579 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" Apr 22 19:16:27.298207 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298208 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" Apr 22 19:16:27.298324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298227 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kube-rbac-proxy" Apr 22 19:16:27.298324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298233 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kube-rbac-proxy" Apr 22 19:16:27.298324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298242 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="storage-initializer" Apr 22 19:16:27.298324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298248 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="storage-initializer" Apr 22 19:16:27.298324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298316 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kube-rbac-proxy" Apr 22 19:16:27.298324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.298326 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4ce0e8ca-d5a2-4278-bfff-8af6ab08a88a" containerName="kserve-container" Apr 22 19:16:27.301467 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.301449 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.304003 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.303978 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\"" Apr 22 19:16:27.304334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.304317 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-pmml-v2-kserve-predictor-serving-cert\"" Apr 22 19:16:27.310833 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.310786 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"] Apr 22 19:16:27.402199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.402167 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8stm5\" (UniqueName: \"kubernetes.io/projected/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kube-api-access-8stm5\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.402384 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.402225 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.402384 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.402306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.402384 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.402332 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.503398 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.503303 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8stm5\" (UniqueName: \"kubernetes.io/projected/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kube-api-access-8stm5\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.503398 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.503369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.503614 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.503416 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls\") 
pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.503614 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.503440 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.503614 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:16:27.503563 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-serving-cert: secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 22 19:16:27.503722 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:16:27.503630 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls podName:42e6a5a5-4a6e-460c-8f0d-18429a9918fc nodeName:}" failed. No retries permitted until 2026-04-22 19:16:28.003613935 +0000 UTC m=+1800.003187809 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls") pod "isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" (UID: "42e6a5a5-4a6e-460c-8f0d-18429a9918fc") : secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 22 19:16:27.503841 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.503823 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.504001 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.503983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.512257 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.512236 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8stm5\" (UniqueName: \"kubernetes.io/projected/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kube-api-access-8stm5\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:27.578791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.578752 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" podUID="41d8fc58-8930-455b-b0df-08f473e37944" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.35:8643/healthz\": dial tcp 10.132.0.35:8643: connect: connection refused" Apr 22 19:16:27.821448 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.821361 2579 generic.go:358] "Generic (PLEG): container finished" podID="41d8fc58-8930-455b-b0df-08f473e37944" containerID="9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5" exitCode=2 Apr 22 19:16:27.821625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:27.821443 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerDied","Data":"9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5"} Apr 22 19:16:28.006654 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:28.006600 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:28.006843 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:16:28.006760 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-serving-cert: secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 22 19:16:28.006843 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:16:28.006827 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls podName:42e6a5a5-4a6e-460c-8f0d-18429a9918fc nodeName:}" failed. No retries permitted until 2026-04-22 19:16:29.006809992 +0000 UTC m=+1801.006383867 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls") pod "isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" (UID: "42e6a5a5-4a6e-460c-8f0d-18429a9918fc") : secret "isvc-pmml-v2-kserve-predictor-serving-cert" not found Apr 22 19:16:29.013819 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:29.013790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:29.016077 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:29.016054 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls\") pod \"isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:29.112201 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:29.112162 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" Apr 22 19:16:29.225866 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:29.225842 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"] Apr 22 19:16:29.228060 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:16:29.228029 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42e6a5a5_4a6e_460c_8f0d_18429a9918fc.slice/crio-7adcdd365c403cd648d2fe8a67de64c15186aa74ea906eb242c6c2d5829370ba WatchSource:0}: Error finding container 7adcdd365c403cd648d2fe8a67de64c15186aa74ea906eb242c6c2d5829370ba: Status 404 returned error can't find the container with id 7adcdd365c403cd648d2fe8a67de64c15186aa74ea906eb242c6c2d5829370ba Apr 22 19:16:29.828317 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:29.828249 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerStarted","Data":"8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59"} Apr 22 19:16:29.828317 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:29.828321 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerStarted","Data":"7adcdd365c403cd648d2fe8a67de64c15186aa74ea906eb242c6c2d5829370ba"} Apr 22 19:16:30.749548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.749525 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:16:30.832336 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.832216 2579 generic.go:358] "Generic (PLEG): container finished" podID="41d8fc58-8930-455b-b0df-08f473e37944" containerID="f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69" exitCode=0 Apr 22 19:16:30.832336 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.832294 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerDied","Data":"f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69"} Apr 22 19:16:30.832526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.832344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" event={"ID":"41d8fc58-8930-455b-b0df-08f473e37944","Type":"ContainerDied","Data":"53f4c8ec6e30b672ea3433ad88a9e364567983800ffdadac02c06e22c6c3fefa"} Apr 22 19:16:30.832526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.832346 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x" Apr 22 19:16:30.832526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.832370 2579 scope.go:117] "RemoveContainer" containerID="9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5" Apr 22 19:16:30.833071 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.833051 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d8fc58-8930-455b-b0df-08f473e37944-kserve-provision-location\") pod \"41d8fc58-8930-455b-b0df-08f473e37944\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " Apr 22 19:16:30.833195 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.833085 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78hts\" (UniqueName: \"kubernetes.io/projected/41d8fc58-8930-455b-b0df-08f473e37944-kube-api-access-78hts\") pod \"41d8fc58-8930-455b-b0df-08f473e37944\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " Apr 22 19:16:30.833195 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.833141 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41d8fc58-8930-455b-b0df-08f473e37944-proxy-tls\") pod \"41d8fc58-8930-455b-b0df-08f473e37944\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " Apr 22 19:16:30.833360 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.833312 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/41d8fc58-8930-455b-b0df-08f473e37944-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") pod \"41d8fc58-8930-455b-b0df-08f473e37944\" (UID: \"41d8fc58-8930-455b-b0df-08f473e37944\") " Apr 22 19:16:30.833490 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.833403 2579 operation_generator.go:781] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/41d8fc58-8930-455b-b0df-08f473e37944-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41d8fc58-8930-455b-b0df-08f473e37944" (UID: "41d8fc58-8930-455b-b0df-08f473e37944"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:16:30.833562 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.833542 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41d8fc58-8930-455b-b0df-08f473e37944-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:16:30.833653 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.833630 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41d8fc58-8930-455b-b0df-08f473e37944-isvc-pmml-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-runtime-kube-rbac-proxy-sar-config") pod "41d8fc58-8930-455b-b0df-08f473e37944" (UID: "41d8fc58-8930-455b-b0df-08f473e37944"). InnerVolumeSpecName "isvc-pmml-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:16:30.835319 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.835292 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41d8fc58-8930-455b-b0df-08f473e37944-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "41d8fc58-8930-455b-b0df-08f473e37944" (UID: "41d8fc58-8930-455b-b0df-08f473e37944"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:16:30.835319 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.835307 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41d8fc58-8930-455b-b0df-08f473e37944-kube-api-access-78hts" (OuterVolumeSpecName: "kube-api-access-78hts") pod "41d8fc58-8930-455b-b0df-08f473e37944" (UID: "41d8fc58-8930-455b-b0df-08f473e37944"). InnerVolumeSpecName "kube-api-access-78hts". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:16:30.848461 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.848440 2579 scope.go:117] "RemoveContainer" containerID="f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69" Apr 22 19:16:30.855098 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.855077 2579 scope.go:117] "RemoveContainer" containerID="15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba" Apr 22 19:16:30.861477 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.861458 2579 scope.go:117] "RemoveContainer" containerID="9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5" Apr 22 19:16:30.861745 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:16:30.861727 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5\": container with ID starting with 9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5 not found: ID does not exist" containerID="9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5" Apr 22 19:16:30.861792 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.861754 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5"} err="failed to get container status \"9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5\": rpc error: code = NotFound desc = 
could not find container \"9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5\": container with ID starting with 9340617d4fe3fb5330852eeda9f01d66f1064eed65fbb6f490b1f3cdd5207bc5 not found: ID does not exist" Apr 22 19:16:30.861792 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.861773 2579 scope.go:117] "RemoveContainer" containerID="f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69" Apr 22 19:16:30.861971 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:16:30.861957 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69\": container with ID starting with f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69 not found: ID does not exist" containerID="f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69" Apr 22 19:16:30.862016 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.861974 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69"} err="failed to get container status \"f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69\": rpc error: code = NotFound desc = could not find container \"f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69\": container with ID starting with f1a4a335130fc42ba2cc6d63807086db220e8d226c14b4a9b2754b05a099de69 not found: ID does not exist" Apr 22 19:16:30.862016 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.861987 2579 scope.go:117] "RemoveContainer" containerID="15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba" Apr 22 19:16:30.862168 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:16:30.862154 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba\": container 
with ID starting with 15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba not found: ID does not exist" containerID="15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba"
Apr 22 19:16:30.862207 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.862172 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba"} err="failed to get container status \"15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba\": rpc error: code = NotFound desc = could not find container \"15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba\": container with ID starting with 15e8a881efe54bb5beb763ac87cae0832e14d2cadaa4fc19bf59575e2cd2f1ba not found: ID does not exist"
Apr 22 19:16:30.934071 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.934026 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/41d8fc58-8930-455b-b0df-08f473e37944-isvc-pmml-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:16:30.934071 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.934064 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-78hts\" (UniqueName: \"kubernetes.io/projected/41d8fc58-8930-455b-b0df-08f473e37944-kube-api-access-78hts\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:16:30.934344 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:30.934078 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/41d8fc58-8930-455b-b0df-08f473e37944-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:16:31.156460 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:31.156427 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x"]
Apr 22 19:16:31.159690 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:31.159665 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-67bc544947-4278x"]
Apr 22 19:16:32.543049 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:32.543011 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41d8fc58-8930-455b-b0df-08f473e37944" path="/var/lib/kubelet/pods/41d8fc58-8930-455b-b0df-08f473e37944/volumes"
Apr 22 19:16:33.846337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:33.846232 2579 generic.go:358] "Generic (PLEG): container finished" podID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerID="8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59" exitCode=0
Apr 22 19:16:33.846337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:33.846304 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerDied","Data":"8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59"}
Apr 22 19:16:34.851143 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:34.851107 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerStarted","Data":"e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c"}
Apr 22 19:16:34.851143 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:34.851148 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerStarted","Data":"c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd"}
Apr 22 19:16:34.851610 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:34.851410 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"
Apr 22 19:16:34.851610 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:34.851434 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"
Apr 22 19:16:34.852670 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:34.852645 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:16:34.871583 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:34.871535 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podStartSLOduration=7.871523784 podStartE2EDuration="7.871523784s" podCreationTimestamp="2026-04-22 19:16:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:16:34.869111667 +0000 UTC m=+1806.868685565" watchObservedRunningTime="2026-04-22 19:16:34.871523784 +0000 UTC m=+1806.871097681"
Apr 22 19:16:35.854172 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:35.854136 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:16:40.858016 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:40.857990 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"
Apr 22 19:16:40.858434 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:40.858416 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:16:50.859295 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:16:50.859235 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:17:00.858995 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:00.858955 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:17:10.858867 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:10.858822 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:17:20.858453 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:20.858411 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:17:30.859105 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:30.859010 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:17:40.858481 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:40.858444 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:17:50.858998 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:50.858909 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"
Apr 22 19:17:58.420831 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.420798 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"]
Apr 22 19:17:58.421394 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.421196 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" containerID="cri-o://c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd" gracePeriod=30
Apr 22 19:17:58.421394 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.421288 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kube-rbac-proxy" containerID="cri-o://e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c" gracePeriod=30
Apr 22 19:17:58.514727 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.514695 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"]
Apr 22 19:17:58.514968 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.514956 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="storage-initializer"
Apr 22 19:17:58.514968 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.514968 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="storage-initializer"
Apr 22 19:17:58.515072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.514977 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container"
Apr 22 19:17:58.515072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.514983 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container"
Apr 22 19:17:58.515072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.514998 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kube-rbac-proxy"
Apr 22 19:17:58.515072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.515004 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kube-rbac-proxy"
Apr 22 19:17:58.515072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.515057 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kserve-container"
Apr 22 19:17:58.515072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.515063 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="41d8fc58-8930-455b-b0df-08f473e37944" containerName="kube-rbac-proxy"
Apr 22 19:17:58.519168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.519152 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.522463 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.522442 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-dc4537-kube-rbac-proxy-sar-config\""
Apr 22 19:17:58.522583 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.522565 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-primary-dc4537-predictor-serving-cert\""
Apr 22 19:17:58.529899 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.529876 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"]
Apr 22 19:17:58.581391 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.581362 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd9e652-319d-4a2f-be89-9a5069ebc52c-proxy-tls\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.581391 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.581398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kserve-provision-location\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.581596 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.581464 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-primary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd9e652-319d-4a2f-be89-9a5069ebc52c-isvc-primary-dc4537-kube-rbac-proxy-sar-config\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.581596 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.581483 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w49vq\" (UniqueName: \"kubernetes.io/projected/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kube-api-access-w49vq\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.682823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.682791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd9e652-319d-4a2f-be89-9a5069ebc52c-proxy-tls\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.683005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.682829 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kserve-provision-location\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.683005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.682869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-primary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd9e652-319d-4a2f-be89-9a5069ebc52c-isvc-primary-dc4537-kube-rbac-proxy-sar-config\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.683005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.682888 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w49vq\" (UniqueName: \"kubernetes.io/projected/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kube-api-access-w49vq\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.683303 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.683284 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kserve-provision-location\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.683666 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.683644 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-primary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd9e652-319d-4a2f-be89-9a5069ebc52c-isvc-primary-dc4537-kube-rbac-proxy-sar-config\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.685147 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.685128 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd9e652-319d-4a2f-be89-9a5069ebc52c-proxy-tls\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.693376 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.693357 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w49vq\" (UniqueName: \"kubernetes.io/projected/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kube-api-access-w49vq\") pod \"isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.829602 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.829568 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:17:58.950305 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.950257 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"]
Apr 22 19:17:58.952878 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:17:58.952849 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fd9e652_319d_4a2f_be89_9a5069ebc52c.slice/crio-717ce68d707f5631d19d0d32b135c24dec40c5b02f9509307389162aabd7d7e3 WatchSource:0}: Error finding container 717ce68d707f5631d19d0d32b135c24dec40c5b02f9509307389162aabd7d7e3: Status 404 returned error can't find the container with id 717ce68d707f5631d19d0d32b135c24dec40c5b02f9509307389162aabd7d7e3
Apr 22 19:17:58.954650 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:58.954635 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:17:59.083929 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:59.083892 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerStarted","Data":"f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d"}
Apr 22 19:17:59.083929 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:59.083932 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerStarted","Data":"717ce68d707f5631d19d0d32b135c24dec40c5b02f9509307389162aabd7d7e3"}
Apr 22 19:17:59.085885 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:59.085855 2579 generic.go:358] "Generic (PLEG): container finished" podID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerID="e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c" exitCode=2
Apr 22 19:17:59.085985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:17:59.085916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerDied","Data":"e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c"}
Apr 22 19:18:00.855029 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:00.854978 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.36:8643/healthz\": dial tcp 10.132.0.36:8643: connect: connection refused"
Apr 22 19:18:00.859362 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:00.859330 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused"
Apr 22 19:18:01.762052 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.762026 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"
Apr 22 19:18:01.805436 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.805348 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kserve-provision-location\") pod \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") "
Apr 22 19:18:01.805436 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.805399 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8stm5\" (UniqueName: \"kubernetes.io/projected/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kube-api-access-8stm5\") pod \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") "
Apr 22 19:18:01.805663 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.805456 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") pod \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") "
Apr 22 19:18:01.805663 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.805524 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls\") pod \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\" (UID: \"42e6a5a5-4a6e-460c-8f0d-18429a9918fc\") "
Apr 22 19:18:01.805756 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.805672 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "42e6a5a5-4a6e-460c-8f0d-18429a9918fc" (UID: "42e6a5a5-4a6e-460c-8f0d-18429a9918fc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:18:01.805879 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.805855 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config") pod "42e6a5a5-4a6e-460c-8f0d-18429a9918fc" (UID: "42e6a5a5-4a6e-460c-8f0d-18429a9918fc"). InnerVolumeSpecName "isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:18:01.807520 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.807492 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "42e6a5a5-4a6e-460c-8f0d-18429a9918fc" (UID: "42e6a5a5-4a6e-460c-8f0d-18429a9918fc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:18:01.807625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.807554 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kube-api-access-8stm5" (OuterVolumeSpecName: "kube-api-access-8stm5") pod "42e6a5a5-4a6e-460c-8f0d-18429a9918fc" (UID: "42e6a5a5-4a6e-460c-8f0d-18429a9918fc"). InnerVolumeSpecName "kube-api-access-8stm5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:18:01.906991 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.906949 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:18:01.906991 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.906985 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8stm5\" (UniqueName: \"kubernetes.io/projected/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-kube-api-access-8stm5\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:18:01.906991 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.907000 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-isvc-pmml-v2-kserve-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:18:01.907471 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:01.907013 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/42e6a5a5-4a6e-460c-8f0d-18429a9918fc-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:18:02.095112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.095009 2579 generic.go:358] "Generic (PLEG): container finished" podID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerID="c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd" exitCode=0
Apr 22 19:18:02.095112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.095091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerDied","Data":"c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd"}
Apr 22 19:18:02.095112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.095102 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"
Apr 22 19:18:02.095408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.095129 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf" event={"ID":"42e6a5a5-4a6e-460c-8f0d-18429a9918fc","Type":"ContainerDied","Data":"7adcdd365c403cd648d2fe8a67de64c15186aa74ea906eb242c6c2d5829370ba"}
Apr 22 19:18:02.095408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.095145 2579 scope.go:117] "RemoveContainer" containerID="e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c"
Apr 22 19:18:02.103056 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.103036 2579 scope.go:117] "RemoveContainer" containerID="c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd"
Apr 22 19:18:02.109921 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.109902 2579 scope.go:117] "RemoveContainer" containerID="8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59"
Apr 22 19:18:02.116334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.116315 2579 scope.go:117] "RemoveContainer" containerID="e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c"
Apr 22 19:18:02.116574 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:18:02.116552 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c\": container with ID starting with e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c not found: ID does not exist" containerID="e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c"
Apr 22 19:18:02.116624 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.116585 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c"} err="failed to get container status \"e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c\": rpc error: code = NotFound desc = could not find container \"e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c\": container with ID starting with e34546451f876bbaa537d4b68439de19bfbc2c7b3e3414c982ef6d952b7efb0c not found: ID does not exist"
Apr 22 19:18:02.116624 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.116612 2579 scope.go:117] "RemoveContainer" containerID="c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd"
Apr 22 19:18:02.116803 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:18:02.116785 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd\": container with ID starting with c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd not found: ID does not exist" containerID="c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd"
Apr 22 19:18:02.116840 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.116809 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd"} err="failed to get container status \"c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd\": rpc error: code = NotFound desc = could not find container \"c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd\": container with ID starting with c7af23a7cc36422bc218ced9c33fbdfe47bf1964eab7a16ada41938d02d7e0dd not found: ID does not exist"
Apr 22 19:18:02.116840 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.116825 2579 scope.go:117] "RemoveContainer" containerID="8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59"
Apr 22 19:18:02.117001 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:18:02.116982 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59\": container with ID starting with 8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59 not found: ID does not exist" containerID="8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59"
Apr 22 19:18:02.117100 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.117005 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59"} err="failed to get container status \"8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59\": rpc error: code = NotFound desc = could not find container \"8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59\": container with ID starting with 8ad909bfed48bfc1b860fc9ce6d8baaf621b540a96951f9b7b49e5aeceb02f59 not found: ID does not exist"
Apr 22 19:18:02.120655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.120633 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"]
Apr 22 19:18:02.123918 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.123897 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-6578f8ffc7-9zbpf"]
Apr 22 19:18:02.543096 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:02.543065 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" path="/var/lib/kubelet/pods/42e6a5a5-4a6e-460c-8f0d-18429a9918fc/volumes"
Apr 22 19:18:03.099275 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:03.099241 2579 generic.go:358] "Generic (PLEG): container finished" podID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerID="f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d" exitCode=0
Apr 22 19:18:03.099692 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:03.099315 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerDied","Data":"f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d"}
Apr 22 19:18:04.104550 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:04.104512 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerStarted","Data":"8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500"}
Apr 22 19:18:04.104938 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:04.104559 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerStarted","Data":"f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3"}
Apr 22 19:18:04.104938 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:04.104764 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:18:04.134763 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:04.134707 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podStartSLOduration=6.134689147 podStartE2EDuration="6.134689147s" podCreationTimestamp="2026-04-22 19:17:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:18:04.132135406 +0000 UTC m=+1896.131709311" watchObservedRunningTime="2026-04-22 19:18:04.134689147 +0000 UTC m=+1896.134263047"
Apr 22 19:18:05.107056 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:05.107014 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:18:05.107897 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:05.107875 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:18:06.110778 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:06.110736 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:18:11.119611 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:11.119580 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:18:11.120182 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:11.120157 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:18:21.120409 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:21.120368 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:18:31.120525 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:31.120484 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:18:41.120778 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:41.120736 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:18:51.120713 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:18:51.120675 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:19:01.120941 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:01.120901 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 22 19:19:11.120874 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:11.120844 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"
Apr 22 19:19:18.844980 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.844950 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f"]
Apr 22 19:19:18.850860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.850828 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="storage-initializer"
Apr 22 19:19:18.851061 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.851046 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="storage-initializer"
Apr 22 19:19:18.851170 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.851159 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kube-rbac-proxy"
Apr 22 19:19:18.851246 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.851238 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kube-rbac-proxy"
Apr 22 19:19:18.851376 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.851365 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container"
Apr 22 19:19:18.851462 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.851451 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container"
Apr 22 19:19:18.851643 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.851625 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kserve-container"
Apr 22 19:19:18.851717 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.851647 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="42e6a5a5-4a6e-460c-8f0d-18429a9918fc" containerName="kube-rbac-proxy"
Apr 22 19:19:18.855210 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.855189 2579 util.go:30] "No
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.860337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.860321 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-dc4537\"" Apr 22 19:19:18.861030 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.861004 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-dc4537-kube-rbac-proxy-sar-config\"" Apr 22 19:19:18.861128 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.861042 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-secondary-dc4537-predictor-serving-cert\"" Apr 22 19:19:18.862034 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.862014 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 19:19:18.864869 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.864850 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-dc4537-dockercfg-bcqgp\"" Apr 22 19:19:18.875113 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.875094 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f"] Apr 22 19:19:18.895635 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.895609 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d01723a-adfe-4e29-b986-a26c49f85a34-proxy-tls\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.895753 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:19:18.895675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-secondary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-isvc-secondary-dc4537-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.895753 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.895708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d01723a-adfe-4e29-b986-a26c49f85a34-kserve-provision-location\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.895753 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.895733 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdkxv\" (UniqueName: \"kubernetes.io/projected/4d01723a-adfe-4e29-b986-a26c49f85a34-kube-api-access-xdkxv\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.895930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.895809 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-cabundle-cert\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.996870 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:19:18.996838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-secondary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-isvc-secondary-dc4537-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.996870 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.996869 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d01723a-adfe-4e29-b986-a26c49f85a34-kserve-provision-location\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.997070 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.996894 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdkxv\" (UniqueName: \"kubernetes.io/projected/4d01723a-adfe-4e29-b986-a26c49f85a34-kube-api-access-xdkxv\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.997070 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.996920 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-cabundle-cert\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.997070 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.996977 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d01723a-adfe-4e29-b986-a26c49f85a34-proxy-tls\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.997329 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.997307 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d01723a-adfe-4e29-b986-a26c49f85a34-kserve-provision-location\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.997554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.997534 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-cabundle-cert\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.997630 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.997611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-secondary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-isvc-secondary-dc4537-kube-rbac-proxy-sar-config\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:18.999343 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:18.999320 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d01723a-adfe-4e29-b986-a26c49f85a34-proxy-tls\") pod 
\"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:19.010726 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:19.010703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdkxv\" (UniqueName: \"kubernetes.io/projected/4d01723a-adfe-4e29-b986-a26c49f85a34-kube-api-access-xdkxv\") pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:19.165432 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:19.165323 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:19.299914 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:19.299887 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f"] Apr 22 19:19:19.300933 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:19:19.300901 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d01723a_adfe_4e29_b986_a26c49f85a34.slice/crio-a909c46abba87ded5298b6bd189d6f929e6696529baf555956279a7b49e98058 WatchSource:0}: Error finding container a909c46abba87ded5298b6bd189d6f929e6696529baf555956279a7b49e98058: Status 404 returned error can't find the container with id a909c46abba87ded5298b6bd189d6f929e6696529baf555956279a7b49e98058 Apr 22 19:19:19.312306 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:19.312255 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" event={"ID":"4d01723a-adfe-4e29-b986-a26c49f85a34","Type":"ContainerStarted","Data":"a909c46abba87ded5298b6bd189d6f929e6696529baf555956279a7b49e98058"} Apr 22 
19:19:20.317186 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:20.317147 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" event={"ID":"4d01723a-adfe-4e29-b986-a26c49f85a34","Type":"ContainerStarted","Data":"a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7"} Apr 22 19:19:22.324551 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:22.324522 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_4d01723a-adfe-4e29-b986-a26c49f85a34/storage-initializer/0.log" Apr 22 19:19:22.324954 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:22.324558 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerID="a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7" exitCode=1 Apr 22 19:19:22.324954 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:22.324640 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" event={"ID":"4d01723a-adfe-4e29-b986-a26c49f85a34","Type":"ContainerDied","Data":"a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7"} Apr 22 19:19:23.328582 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:23.328556 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_4d01723a-adfe-4e29-b986-a26c49f85a34/storage-initializer/0.log" Apr 22 19:19:23.328977 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:23.328662 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" event={"ID":"4d01723a-adfe-4e29-b986-a26c49f85a34","Type":"ContainerStarted","Data":"d149632d0f708447093610f682208d349729b026906299dffb34a3530f0f9014"} Apr 22 19:19:28.344512 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:28.344487 2579 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_4d01723a-adfe-4e29-b986-a26c49f85a34/storage-initializer/1.log" Apr 22 19:19:28.344930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:28.344811 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_4d01723a-adfe-4e29-b986-a26c49f85a34/storage-initializer/0.log" Apr 22 19:19:28.344930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:28.344843 2579 generic.go:358] "Generic (PLEG): container finished" podID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerID="d149632d0f708447093610f682208d349729b026906299dffb34a3530f0f9014" exitCode=1 Apr 22 19:19:28.344930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:28.344914 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" event={"ID":"4d01723a-adfe-4e29-b986-a26c49f85a34","Type":"ContainerDied","Data":"d149632d0f708447093610f682208d349729b026906299dffb34a3530f0f9014"} Apr 22 19:19:28.345044 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:28.344954 2579 scope.go:117] "RemoveContainer" containerID="a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7" Apr 22 19:19:28.345348 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:28.345332 2579 scope.go:117] "RemoveContainer" containerID="a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7" Apr 22 19:19:28.357040 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:19:28.357012 2579 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_kserve-ci-e2e-test_4d01723a-adfe-4e29-b986-a26c49f85a34_0 in pod sandbox a909c46abba87ded5298b6bd189d6f929e6696529baf555956279a7b49e98058 from index: no such id: 'a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7'" 
containerID="a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7" Apr 22 19:19:28.357113 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:19:28.357063 2579 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_kserve-ci-e2e-test_4d01723a-adfe-4e29-b986-a26c49f85a34_0 in pod sandbox a909c46abba87ded5298b6bd189d6f929e6696529baf555956279a7b49e98058 from index: no such id: 'a400a95cb44a1a262ecbc2bce991d0524b1188de6cb69538eb4408bab04e63d7'; Skipping pod \"isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_kserve-ci-e2e-test(4d01723a-adfe-4e29-b986-a26c49f85a34)\"" logger="UnhandledError" Apr 22 19:19:28.358433 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:19:28.358414 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_kserve-ci-e2e-test(4d01723a-adfe-4e29-b986-a26c49f85a34)\"" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" Apr 22 19:19:29.348777 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:29.348748 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_4d01723a-adfe-4e29-b986-a26c49f85a34/storage-initializer/1.log" Apr 22 19:19:32.966866 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:32.966833 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f"] Apr 22 19:19:33.101542 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.101520 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_4d01723a-adfe-4e29-b986-a26c49f85a34/storage-initializer/1.log" Apr 22 19:19:33.101644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.101582 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:33.109505 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.109482 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"] Apr 22 19:19:33.109771 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.109751 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" containerID="cri-o://f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3" gracePeriod=30 Apr 22 19:19:33.109865 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.109842 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kube-rbac-proxy" containerID="cri-o://8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500" gracePeriod=30 Apr 22 19:19:33.208001 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.207963 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-cabundle-cert\") pod \"4d01723a-adfe-4e29-b986-a26c49f85a34\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " Apr 22 19:19:33.208168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208016 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdkxv\" (UniqueName: 
\"kubernetes.io/projected/4d01723a-adfe-4e29-b986-a26c49f85a34-kube-api-access-xdkxv\") pod \"4d01723a-adfe-4e29-b986-a26c49f85a34\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " Apr 22 19:19:33.208168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208042 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d01723a-adfe-4e29-b986-a26c49f85a34-kserve-provision-location\") pod \"4d01723a-adfe-4e29-b986-a26c49f85a34\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " Apr 22 19:19:33.208168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208150 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d01723a-adfe-4e29-b986-a26c49f85a34-proxy-tls\") pod \"4d01723a-adfe-4e29-b986-a26c49f85a34\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " Apr 22 19:19:33.208368 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208182 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-secondary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-isvc-secondary-dc4537-kube-rbac-proxy-sar-config\") pod \"4d01723a-adfe-4e29-b986-a26c49f85a34\" (UID: \"4d01723a-adfe-4e29-b986-a26c49f85a34\") " Apr 22 19:19:33.208368 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208332 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d01723a-adfe-4e29-b986-a26c49f85a34-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d01723a-adfe-4e29-b986-a26c49f85a34" (UID: "4d01723a-adfe-4e29-b986-a26c49f85a34"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:19:33.208489 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208477 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d01723a-adfe-4e29-b986-a26c49f85a34-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:33.208552 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208527 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "4d01723a-adfe-4e29-b986-a26c49f85a34" (UID: "4d01723a-adfe-4e29-b986-a26c49f85a34"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:19:33.208611 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.208586 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-isvc-secondary-dc4537-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-secondary-dc4537-kube-rbac-proxy-sar-config") pod "4d01723a-adfe-4e29-b986-a26c49f85a34" (UID: "4d01723a-adfe-4e29-b986-a26c49f85a34"). InnerVolumeSpecName "isvc-secondary-dc4537-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:19:33.210185 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.210158 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d01723a-adfe-4e29-b986-a26c49f85a34-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4d01723a-adfe-4e29-b986-a26c49f85a34" (UID: "4d01723a-adfe-4e29-b986-a26c49f85a34"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:19:33.210470 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.210216 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d01723a-adfe-4e29-b986-a26c49f85a34-kube-api-access-xdkxv" (OuterVolumeSpecName: "kube-api-access-xdkxv") pod "4d01723a-adfe-4e29-b986-a26c49f85a34" (UID: "4d01723a-adfe-4e29-b986-a26c49f85a34"). InnerVolumeSpecName "kube-api-access-xdkxv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:19:33.275131 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.275097 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg"] Apr 22 19:19:33.275428 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.275414 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerName="storage-initializer" Apr 22 19:19:33.275476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.275430 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerName="storage-initializer" Apr 22 19:19:33.275476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.275443 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerName="storage-initializer" Apr 22 19:19:33.275476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.275450 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerName="storage-initializer" Apr 22 19:19:33.275575 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.275490 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerName="storage-initializer" Apr 22 19:19:33.275609 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.275583 2579 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" containerName="storage-initializer" Apr 22 19:19:33.278341 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.278325 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.283129 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.283108 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\"" Apr 22 19:19:33.283338 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.283320 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-8d3837-dockercfg-srq9z\"" Apr 22 19:19:33.283425 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.283320 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-init-fail-8d3837-predictor-serving-cert\"" Apr 22 19:19:33.283425 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.283322 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-8d3837\"" Apr 22 19:19:33.295602 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.295572 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg"] Apr 22 19:19:33.309759 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.309732 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4d01723a-adfe-4e29-b986-a26c49f85a34-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:33.309759 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.309761 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-secondary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-isvc-secondary-dc4537-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:33.309932 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.309774 2579 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/4d01723a-adfe-4e29-b986-a26c49f85a34-cabundle-cert\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:33.309932 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.309785 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xdkxv\" (UniqueName: \"kubernetes.io/projected/4d01723a-adfe-4e29-b986-a26c49f85a34-kube-api-access-xdkxv\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:33.361663 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.361631 2579 generic.go:358] "Generic (PLEG): container finished" podID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerID="8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500" exitCode=2 Apr 22 19:19:33.361819 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.361714 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerDied","Data":"8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500"} Apr 22 19:19:33.362725 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.362705 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-dc4537-predictor-b7d55cb99-87v9f_4d01723a-adfe-4e29-b986-a26c49f85a34/storage-initializer/1.log" Apr 22 19:19:33.362835 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.362770 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" 
event={"ID":"4d01723a-adfe-4e29-b986-a26c49f85a34","Type":"ContainerDied","Data":"a909c46abba87ded5298b6bd189d6f929e6696529baf555956279a7b49e98058"} Apr 22 19:19:33.362835 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.362812 2579 scope.go:117] "RemoveContainer" containerID="d149632d0f708447093610f682208d349729b026906299dffb34a3530f0f9014" Apr 22 19:19:33.362907 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.362855 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f" Apr 22 19:19:33.410740 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.410717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-cabundle-cert\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.410860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.410753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cz8dc\" (UniqueName: \"kubernetes.io/projected/12c4280d-72d4-46c7-9308-cfc76d461cf8-kube-api-access-cz8dc\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.410860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.410810 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c4280d-72d4-46c7-9308-cfc76d461cf8-kserve-provision-location\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.410860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.410832 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c4280d-72d4-46c7-9308-cfc76d461cf8-proxy-tls\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.410985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.410874 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.434941 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.434913 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f"] Apr 22 19:19:33.436574 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.436550 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-dc4537-predictor-b7d55cb99-87v9f"] Apr 22 19:19:33.511933 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.511851 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c4280d-72d4-46c7-9308-cfc76d461cf8-kserve-provision-location\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.511933 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.511887 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c4280d-72d4-46c7-9308-cfc76d461cf8-proxy-tls\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.511933 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.511909 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.511933 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.511933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-cabundle-cert\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.512224 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.512046 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cz8dc\" (UniqueName: \"kubernetes.io/projected/12c4280d-72d4-46c7-9308-cfc76d461cf8-kube-api-access-cz8dc\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.512354 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.512328 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c4280d-72d4-46c7-9308-cfc76d461cf8-kserve-provision-location\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.512600 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.512579 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.512706 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.512677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-cabundle-cert\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.514463 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.514447 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c4280d-72d4-46c7-9308-cfc76d461cf8-proxy-tls\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.528472 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.528446 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cz8dc\" (UniqueName: 
\"kubernetes.io/projected/12c4280d-72d4-46c7-9308-cfc76d461cf8-kube-api-access-cz8dc\") pod \"isvc-init-fail-8d3837-predictor-6498f96895-xwwcg\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.589593 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.589560 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:33.718050 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:33.718024 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg"] Apr 22 19:19:33.719190 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:19:33.719168 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c4280d_72d4_46c7_9308_cfc76d461cf8.slice/crio-632563bcc47a2b95976cb74445b87ba4c1aaa18def0b410b35df987d1a85764e WatchSource:0}: Error finding container 632563bcc47a2b95976cb74445b87ba4c1aaa18def0b410b35df987d1a85764e: Status 404 returned error can't find the container with id 632563bcc47a2b95976cb74445b87ba4c1aaa18def0b410b35df987d1a85764e Apr 22 19:19:34.366990 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:34.366954 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" event={"ID":"12c4280d-72d4-46c7-9308-cfc76d461cf8","Type":"ContainerStarted","Data":"c2428227b935513f0cd361396ef9fbc63d2b459c902889a48a57e532739e00d7"} Apr 22 19:19:34.366990 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:34.366997 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" event={"ID":"12c4280d-72d4-46c7-9308-cfc76d461cf8","Type":"ContainerStarted","Data":"632563bcc47a2b95976cb74445b87ba4c1aaa18def0b410b35df987d1a85764e"} Apr 22 
19:19:34.545078 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:34.545040 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d01723a-adfe-4e29-b986-a26c49f85a34" path="/var/lib/kubelet/pods/4d01723a-adfe-4e29-b986-a26c49f85a34/volumes" Apr 22 19:19:36.111308 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:36.111244 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.37:8643/healthz\": dial tcp 10.132.0.37:8643: connect: connection refused" Apr 22 19:19:37.241274 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.241238 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" Apr 22 19:19:37.344281 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.344198 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kserve-provision-location\") pod \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " Apr 22 19:19:37.344281 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.344248 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w49vq\" (UniqueName: \"kubernetes.io/projected/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kube-api-access-w49vq\") pod \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " Apr 22 19:19:37.344453 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.344298 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-primary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/2fd9e652-319d-4a2f-be89-9a5069ebc52c-isvc-primary-dc4537-kube-rbac-proxy-sar-config\") pod \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " Apr 22 19:19:37.344453 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.344360 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd9e652-319d-4a2f-be89-9a5069ebc52c-proxy-tls\") pod \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\" (UID: \"2fd9e652-319d-4a2f-be89-9a5069ebc52c\") " Apr 22 19:19:37.344580 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.344557 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2fd9e652-319d-4a2f-be89-9a5069ebc52c" (UID: "2fd9e652-319d-4a2f-be89-9a5069ebc52c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:19:37.344641 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.344619 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2fd9e652-319d-4a2f-be89-9a5069ebc52c-isvc-primary-dc4537-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-primary-dc4537-kube-rbac-proxy-sar-config") pod "2fd9e652-319d-4a2f-be89-9a5069ebc52c" (UID: "2fd9e652-319d-4a2f-be89-9a5069ebc52c"). InnerVolumeSpecName "isvc-primary-dc4537-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:19:37.346401 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.346376 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kube-api-access-w49vq" (OuterVolumeSpecName: "kube-api-access-w49vq") pod "2fd9e652-319d-4a2f-be89-9a5069ebc52c" (UID: "2fd9e652-319d-4a2f-be89-9a5069ebc52c"). 
InnerVolumeSpecName "kube-api-access-w49vq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:19:37.346401 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.346395 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2fd9e652-319d-4a2f-be89-9a5069ebc52c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2fd9e652-319d-4a2f-be89-9a5069ebc52c" (UID: "2fd9e652-319d-4a2f-be89-9a5069ebc52c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:19:37.378148 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.378113 2579 generic.go:358] "Generic (PLEG): container finished" podID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerID="f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3" exitCode=0 Apr 22 19:19:37.378255 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.378168 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerDied","Data":"f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3"} Apr 22 19:19:37.378255 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.378196 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" Apr 22 19:19:37.378255 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.378208 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2" event={"ID":"2fd9e652-319d-4a2f-be89-9a5069ebc52c","Type":"ContainerDied","Data":"717ce68d707f5631d19d0d32b135c24dec40c5b02f9509307389162aabd7d7e3"} Apr 22 19:19:37.378255 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.378228 2579 scope.go:117] "RemoveContainer" containerID="8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500" Apr 22 19:19:37.386253 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.386127 2579 scope.go:117] "RemoveContainer" containerID="f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3" Apr 22 19:19:37.393167 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.393150 2579 scope.go:117] "RemoveContainer" containerID="f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d" Apr 22 19:19:37.399971 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.399953 2579 scope.go:117] "RemoveContainer" containerID="8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500" Apr 22 19:19:37.400203 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:19:37.400185 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500\": container with ID starting with 8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500 not found: ID does not exist" containerID="8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500" Apr 22 19:19:37.400245 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.400209 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500"} err="failed to get 
container status \"8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500\": rpc error: code = NotFound desc = could not find container \"8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500\": container with ID starting with 8f592cea25ff373044d8b1bb46167452e081b6e5a0273a985b521a514d619500 not found: ID does not exist" Apr 22 19:19:37.400245 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.400226 2579 scope.go:117] "RemoveContainer" containerID="f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3" Apr 22 19:19:37.400456 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:19:37.400440 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3\": container with ID starting with f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3 not found: ID does not exist" containerID="f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3" Apr 22 19:19:37.400510 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.400459 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3"} err="failed to get container status \"f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3\": rpc error: code = NotFound desc = could not find container \"f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3\": container with ID starting with f240d1043cd475142ce2e8e6e97c44b9a7def1fb0969655db9dcd5fbf36309c3 not found: ID does not exist" Apr 22 19:19:37.400510 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.400473 2579 scope.go:117] "RemoveContainer" containerID="f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d" Apr 22 19:19:37.400694 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:19:37.400675 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d\": container with ID starting with f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d not found: ID does not exist" containerID="f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d" Apr 22 19:19:37.400737 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.400701 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d"} err="failed to get container status \"f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d\": rpc error: code = NotFound desc = could not find container \"f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d\": container with ID starting with f349b919e41ad1d073ee8d55e5cdbbc9ce64c1ab3e8f0b427de7020ab6f7bc6d not found: ID does not exist" Apr 22 19:19:37.415740 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.415715 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"] Apr 22 19:19:37.422215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.422195 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-dc4537-predictor-574dc8f9fb-bpdk2"] Apr 22 19:19:37.445451 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.445430 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-primary-dc4537-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2fd9e652-319d-4a2f-be89-9a5069ebc52c-isvc-primary-dc4537-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:37.445512 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.445451 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2fd9e652-319d-4a2f-be89-9a5069ebc52c-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" 
DevicePath \"\"" Apr 22 19:19:37.445512 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.445462 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:37.445512 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:37.445470 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w49vq\" (UniqueName: \"kubernetes.io/projected/2fd9e652-319d-4a2f-be89-9a5069ebc52c-kube-api-access-w49vq\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:38.382695 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:38.382620 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8d3837-predictor-6498f96895-xwwcg_12c4280d-72d4-46c7-9308-cfc76d461cf8/storage-initializer/0.log" Apr 22 19:19:38.382695 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:38.382657 2579 generic.go:358] "Generic (PLEG): container finished" podID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerID="c2428227b935513f0cd361396ef9fbc63d2b459c902889a48a57e532739e00d7" exitCode=1 Apr 22 19:19:38.383146 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:38.382736 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" event={"ID":"12c4280d-72d4-46c7-9308-cfc76d461cf8","Type":"ContainerDied","Data":"c2428227b935513f0cd361396ef9fbc63d2b459c902889a48a57e532739e00d7"} Apr 22 19:19:38.543206 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:38.543174 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" path="/var/lib/kubelet/pods/2fd9e652-319d-4a2f-be89-9a5069ebc52c/volumes" Apr 22 19:19:39.388331 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:39.388304 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8d3837-predictor-6498f96895-xwwcg_12c4280d-72d4-46c7-9308-cfc76d461cf8/storage-initializer/0.log" Apr 22 19:19:39.388748 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:39.388386 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" event={"ID":"12c4280d-72d4-46c7-9308-cfc76d461cf8","Type":"ContainerStarted","Data":"29566a8e760442d18ffe6c0728ae9167a708966246898935f08e6c01a0ace3f6"} Apr 22 19:19:43.184682 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.184649 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg"] Apr 22 19:19:43.185049 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.184900 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerName="storage-initializer" containerID="cri-o://29566a8e760442d18ffe6c0728ae9167a708966246898935f08e6c01a0ace3f6" gracePeriod=30 Apr 22 19:19:43.401687 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.401650 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8d3837-predictor-6498f96895-xwwcg_12c4280d-72d4-46c7-9308-cfc76d461cf8/storage-initializer/1.log" Apr 22 19:19:43.402047 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.402031 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8d3837-predictor-6498f96895-xwwcg_12c4280d-72d4-46c7-9308-cfc76d461cf8/storage-initializer/0.log" Apr 22 19:19:43.402141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.402065 2579 generic.go:358] "Generic (PLEG): container finished" podID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerID="29566a8e760442d18ffe6c0728ae9167a708966246898935f08e6c01a0ace3f6" exitCode=1 Apr 22 19:19:43.402141 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.402100 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" event={"ID":"12c4280d-72d4-46c7-9308-cfc76d461cf8","Type":"ContainerDied","Data":"29566a8e760442d18ffe6c0728ae9167a708966246898935f08e6c01a0ace3f6"} Apr 22 19:19:43.402141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.402131 2579 scope.go:117] "RemoveContainer" containerID="c2428227b935513f0cd361396ef9fbc63d2b459c902889a48a57e532739e00d7" Apr 22 19:19:43.439517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439434 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"] Apr 22 19:19:43.439713 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439699 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" Apr 22 19:19:43.439713 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439713 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" Apr 22 19:19:43.439844 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439722 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kube-rbac-proxy" Apr 22 19:19:43.439844 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439727 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kube-rbac-proxy" Apr 22 19:19:43.439844 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439742 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="storage-initializer" Apr 22 19:19:43.439844 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439748 2579 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="storage-initializer" Apr 22 19:19:43.439844 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439795 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kserve-container" Apr 22 19:19:43.439844 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.439803 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fd9e652-319d-4a2f-be89-9a5069ebc52c" containerName="kube-rbac-proxy" Apr 22 19:19:43.443818 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.443798 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.451028 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.450999 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\"" Apr 22 19:19:43.451152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.451040 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-predictor-serving-cert\"" Apr 22 19:19:43.451238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.451199 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bz9sf\"" Apr 22 19:19:43.494221 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.494191 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"] Apr 22 19:19:43.527233 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.527207 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8d3837-predictor-6498f96895-xwwcg_12c4280d-72d4-46c7-9308-cfc76d461cf8/storage-initializer/1.log" Apr 22 19:19:43.527362 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:19:43.527307 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:43.591201 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.591173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4de789a8-a689-462e-9ab7-1a48b43e2707-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.591381 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.591206 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4de789a8-a689-462e-9ab7-1a48b43e2707-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.591381 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.591234 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8clld\" (UniqueName: \"kubernetes.io/projected/4de789a8-a689-462e-9ab7-1a48b43e2707-kube-api-access-8clld\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.591381 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.591336 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4de789a8-a689-462e-9ab7-1a48b43e2707-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.692054 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.691968 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\") pod \"12c4280d-72d4-46c7-9308-cfc76d461cf8\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " Apr 22 19:19:43.692219 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692075 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c4280d-72d4-46c7-9308-cfc76d461cf8-kserve-provision-location\") pod \"12c4280d-72d4-46c7-9308-cfc76d461cf8\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " Apr 22 19:19:43.692219 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692113 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cz8dc\" (UniqueName: \"kubernetes.io/projected/12c4280d-72d4-46c7-9308-cfc76d461cf8-kube-api-access-cz8dc\") pod \"12c4280d-72d4-46c7-9308-cfc76d461cf8\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " Apr 22 19:19:43.692219 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692143 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c4280d-72d4-46c7-9308-cfc76d461cf8-proxy-tls\") pod \"12c4280d-72d4-46c7-9308-cfc76d461cf8\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " Apr 22 19:19:43.692219 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692183 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-cabundle-cert\") pod \"12c4280d-72d4-46c7-9308-cfc76d461cf8\" (UID: \"12c4280d-72d4-46c7-9308-cfc76d461cf8\") " Apr 22 19:19:43.692479 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de789a8-a689-462e-9ab7-1a48b43e2707-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.692479 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692417 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4de789a8-a689-462e-9ab7-1a48b43e2707-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.692479 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692430 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12c4280d-72d4-46c7-9308-cfc76d461cf8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12c4280d-72d4-46c7-9308-cfc76d461cf8" (UID: "12c4280d-72d4-46c7-9308-cfc76d461cf8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:19:43.692616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692435 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-isvc-init-fail-8d3837-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-init-fail-8d3837-kube-rbac-proxy-sar-config") pod "12c4280d-72d4-46c7-9308-cfc76d461cf8" (UID: "12c4280d-72d4-46c7-9308-cfc76d461cf8"). InnerVolumeSpecName "isvc-init-fail-8d3837-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:19:43.692616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692450 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4de789a8-a689-462e-9ab7-1a48b43e2707-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.692616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692559 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8clld\" (UniqueName: \"kubernetes.io/projected/4de789a8-a689-462e-9ab7-1a48b43e2707-kube-api-access-8clld\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.692616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692605 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-isvc-init-fail-8d3837-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:43.692821 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:19:43.692624 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12c4280d-72d4-46c7-9308-cfc76d461cf8-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:43.692821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692617 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "12c4280d-72d4-46c7-9308-cfc76d461cf8" (UID: "12c4280d-72d4-46c7-9308-cfc76d461cf8"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:19:43.692821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.692797 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4de789a8-a689-462e-9ab7-1a48b43e2707-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.693170 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.693147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4de789a8-a689-462e-9ab7-1a48b43e2707-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.694350 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.694325 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c4280d-72d4-46c7-9308-cfc76d461cf8-proxy-tls" (OuterVolumeSpecName: 
"proxy-tls") pod "12c4280d-72d4-46c7-9308-cfc76d461cf8" (UID: "12c4280d-72d4-46c7-9308-cfc76d461cf8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:19:43.694708 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.694681 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c4280d-72d4-46c7-9308-cfc76d461cf8-kube-api-access-cz8dc" (OuterVolumeSpecName: "kube-api-access-cz8dc") pod "12c4280d-72d4-46c7-9308-cfc76d461cf8" (UID: "12c4280d-72d4-46c7-9308-cfc76d461cf8"). InnerVolumeSpecName "kube-api-access-cz8dc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:19:43.694994 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.694975 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de789a8-a689-462e-9ab7-1a48b43e2707-proxy-tls\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.706779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.706758 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8clld\" (UniqueName: \"kubernetes.io/projected/4de789a8-a689-462e-9ab7-1a48b43e2707-kube-api-access-8clld\") pod \"isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.755845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.755810 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:19:43.793753 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.793718 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cz8dc\" (UniqueName: \"kubernetes.io/projected/12c4280d-72d4-46c7-9308-cfc76d461cf8-kube-api-access-cz8dc\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:43.793753 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.793752 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/12c4280d-72d4-46c7-9308-cfc76d461cf8-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:43.793930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.793765 2579 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/12c4280d-72d4-46c7-9308-cfc76d461cf8-cabundle-cert\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:19:43.890142 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:43.890121 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"] Apr 22 19:19:43.891813 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:19:43.891775 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4de789a8_a689_462e_9ab7_1a48b43e2707.slice/crio-db2e9ea0cc51408b43e2c81d9dc44f998505708b34b4cf608196de60943b3592 WatchSource:0}: Error finding container db2e9ea0cc51408b43e2c81d9dc44f998505708b34b4cf608196de60943b3592: Status 404 returned error can't find the container with id db2e9ea0cc51408b43e2c81d9dc44f998505708b34b4cf608196de60943b3592 Apr 22 19:19:44.406017 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.405925 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-8d3837-predictor-6498f96895-xwwcg_12c4280d-72d4-46c7-9308-cfc76d461cf8/storage-initializer/1.log" Apr 22 19:19:44.406017 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.406006 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" event={"ID":"12c4280d-72d4-46c7-9308-cfc76d461cf8","Type":"ContainerDied","Data":"632563bcc47a2b95976cb74445b87ba4c1aaa18def0b410b35df987d1a85764e"} Apr 22 19:19:44.406544 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.406033 2579 scope.go:117] "RemoveContainer" containerID="29566a8e760442d18ffe6c0728ae9167a708966246898935f08e6c01a0ace3f6" Apr 22 19:19:44.406544 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.406105 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg" Apr 22 19:19:44.407722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.407699 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerStarted","Data":"f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed"} Apr 22 19:19:44.407722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.407725 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerStarted","Data":"db2e9ea0cc51408b43e2c81d9dc44f998505708b34b4cf608196de60943b3592"} Apr 22 19:19:44.518433 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.518399 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg"] Apr 22 19:19:44.528658 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.528634 2579 kubelet.go:2547] "SyncLoop 
REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-8d3837-predictor-6498f96895-xwwcg"] Apr 22 19:19:44.543653 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:44.543628 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" path="/var/lib/kubelet/pods/12c4280d-72d4-46c7-9308-cfc76d461cf8/volumes" Apr 22 19:19:48.418978 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:48.418949 2579 generic.go:358] "Generic (PLEG): container finished" podID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerID="f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed" exitCode=0 Apr 22 19:19:48.419373 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:19:48.419032 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerDied","Data":"f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed"} Apr 22 19:20:12.497681 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:12.497598 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerStarted","Data":"39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0"} Apr 22 19:20:12.497681 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:12.497639 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerStarted","Data":"fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2"} Apr 22 19:20:12.498229 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:12.497836 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:20:12.528934 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:20:12.528893 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podStartSLOduration=5.759814191 podStartE2EDuration="29.528880328s" podCreationTimestamp="2026-04-22 19:19:43 +0000 UTC" firstStartedPulling="2026-04-22 19:19:48.420108996 +0000 UTC m=+2000.419682875" lastFinishedPulling="2026-04-22 19:20:12.189175134 +0000 UTC m=+2024.188749012" observedRunningTime="2026-04-22 19:20:12.526351766 +0000 UTC m=+2024.525925663" watchObservedRunningTime="2026-04-22 19:20:12.528880328 +0000 UTC m=+2024.528454225" Apr 22 19:20:13.501584 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:13.501549 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:20:13.502856 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:13.502831 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:20:14.504610 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:14.504569 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:20:19.509350 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:19.509317 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:20:19.509935 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:19.509906 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:20:29.509934 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:29.509894 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:20:39.509816 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:39.509778 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:20:49.510514 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:49.510426 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:20:59.510491 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:20:59.510454 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:21:09.509847 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:09.509806 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" 
podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:21:19.510433 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:19.510394 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 22 19:21:29.510422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:29.510391 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" Apr 22 19:21:33.377159 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.377126 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"] Apr 22 19:21:33.377651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.377428 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container" containerID="cri-o://fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2" gracePeriod=30 Apr 22 19:21:33.377651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.377476 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kube-rbac-proxy" containerID="cri-o://39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0" gracePeriod=30 Apr 22 19:21:33.491619 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.491589 2579 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"] Apr 22 19:21:33.491955 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.491936 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerName="storage-initializer" Apr 22 19:21:33.492024 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.491959 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerName="storage-initializer" Apr 22 19:21:33.492024 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.491975 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerName="storage-initializer" Apr 22 19:21:33.492024 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.491983 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerName="storage-initializer" Apr 22 19:21:33.492124 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.492063 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerName="storage-initializer" Apr 22 19:21:33.492124 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.492074 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="12c4280d-72d4-46c7-9308-cfc76d461cf8" containerName="storage-initializer" Apr 22 19:21:33.495035 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.495019 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.497712 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.497695 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\"" Apr 22 19:21:33.497826 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.497806 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-predictor-serving-cert\"" Apr 22 19:21:33.508930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.504821 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"] Apr 22 19:21:33.523422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.523398 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622d2\" (UniqueName: \"kubernetes.io/projected/767cd722-e427-4d62-9c92-20920c8248dc-kube-api-access-622d2\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.523515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.523435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767cd722-e427-4d62-9c92-20920c8248dc-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.523515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.523462 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/767cd722-e427-4d62-9c92-20920c8248dc-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.523515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.523479 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/767cd722-e427-4d62-9c92-20920c8248dc-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.623853 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.623813 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767cd722-e427-4d62-9c92-20920c8248dc-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.623853 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.623853 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/767cd722-e427-4d62-9c92-20920c8248dc-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.624079 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.623899 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-api-access-622d2\" (UniqueName: \"kubernetes.io/projected/767cd722-e427-4d62-9c92-20920c8248dc-kube-api-access-622d2\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.624079 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.623926 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767cd722-e427-4d62-9c92-20920c8248dc-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.624328 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.624302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767cd722-e427-4d62-9c92-20920c8248dc-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.624567 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.624549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/767cd722-e427-4d62-9c92-20920c8248dc-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.626355 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.626336 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/767cd722-e427-4d62-9c92-20920c8248dc-proxy-tls\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.633697 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.633640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-622d2\" (UniqueName: \"kubernetes.io/projected/767cd722-e427-4d62-9c92-20920c8248dc-kube-api-access-622d2\") pod \"isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.720585 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.720551 2579 generic.go:358] "Generic (PLEG): container finished" podID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerID="39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0" exitCode=2 Apr 22 19:21:33.720729 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.720625 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerDied","Data":"39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0"} Apr 22 19:21:33.806278 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.806228 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" Apr 22 19:21:33.926468 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:33.926400 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"] Apr 22 19:21:33.929580 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:21:33.929553 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod767cd722_e427_4d62_9c92_20920c8248dc.slice/crio-8ea7052bf36040a23427911fd88944a189117b6057dafbb036a602ccca81c548 WatchSource:0}: Error finding container 8ea7052bf36040a23427911fd88944a189117b6057dafbb036a602ccca81c548: Status 404 returned error can't find the container with id 8ea7052bf36040a23427911fd88944a189117b6057dafbb036a602ccca81c548 Apr 22 19:21:34.504826 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:34.504775 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.40:8643/healthz\": dial tcp 10.132.0.40:8643: connect: connection refused" Apr 22 19:21:34.724404 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:34.724371 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerStarted","Data":"6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183"} Apr 22 19:21:34.724404 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:34.724407 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" 
event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerStarted","Data":"8ea7052bf36040a23427911fd88944a189117b6057dafbb036a602ccca81c548"}
Apr 22 19:21:38.116235 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.116213 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"
Apr 22 19:21:38.155210 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.155140 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4de789a8-a689-462e-9ab7-1a48b43e2707-kserve-provision-location\") pod \"4de789a8-a689-462e-9ab7-1a48b43e2707\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") "
Apr 22 19:21:38.155210 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.155191 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4de789a8-a689-462e-9ab7-1a48b43e2707-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") pod \"4de789a8-a689-462e-9ab7-1a48b43e2707\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") "
Apr 22 19:21:38.155428 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.155220 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de789a8-a689-462e-9ab7-1a48b43e2707-proxy-tls\") pod \"4de789a8-a689-462e-9ab7-1a48b43e2707\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") "
Apr 22 19:21:38.155428 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.155238 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8clld\" (UniqueName: \"kubernetes.io/projected/4de789a8-a689-462e-9ab7-1a48b43e2707-kube-api-access-8clld\") pod \"4de789a8-a689-462e-9ab7-1a48b43e2707\" (UID: \"4de789a8-a689-462e-9ab7-1a48b43e2707\") "
Apr 22 19:21:38.155536 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.155508 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de789a8-a689-462e-9ab7-1a48b43e2707-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4de789a8-a689-462e-9ab7-1a48b43e2707" (UID: "4de789a8-a689-462e-9ab7-1a48b43e2707"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:21:38.155588 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.155531 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4de789a8-a689-462e-9ab7-1a48b43e2707-isvc-predictive-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-sklearn-kube-rbac-proxy-sar-config") pod "4de789a8-a689-462e-9ab7-1a48b43e2707" (UID: "4de789a8-a689-462e-9ab7-1a48b43e2707"). InnerVolumeSpecName "isvc-predictive-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:21:38.157315 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.157287 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4de789a8-a689-462e-9ab7-1a48b43e2707-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "4de789a8-a689-462e-9ab7-1a48b43e2707" (UID: "4de789a8-a689-462e-9ab7-1a48b43e2707"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:21:38.157315 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.157301 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de789a8-a689-462e-9ab7-1a48b43e2707-kube-api-access-8clld" (OuterVolumeSpecName: "kube-api-access-8clld") pod "4de789a8-a689-462e-9ab7-1a48b43e2707" (UID: "4de789a8-a689-462e-9ab7-1a48b43e2707"). InnerVolumeSpecName "kube-api-access-8clld". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:21:38.256379 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.256326 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4de789a8-a689-462e-9ab7-1a48b43e2707-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:21:38.256379 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.256373 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/4de789a8-a689-462e-9ab7-1a48b43e2707-isvc-predictive-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:21:38.256379 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.256388 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4de789a8-a689-462e-9ab7-1a48b43e2707-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:21:38.256627 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.256404 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8clld\" (UniqueName: \"kubernetes.io/projected/4de789a8-a689-462e-9ab7-1a48b43e2707-kube-api-access-8clld\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:21:38.735736 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.735704 2579 generic.go:358] "Generic (PLEG): container finished" podID="767cd722-e427-4d62-9c92-20920c8248dc" containerID="6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183" exitCode=0
Apr 22 19:21:38.735900 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.735779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerDied","Data":"6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183"}
Apr 22 19:21:38.737445 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.737425 2579 generic.go:358] "Generic (PLEG): container finished" podID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerID="fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2" exitCode=0
Apr 22 19:21:38.737554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.737465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerDied","Data":"fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2"}
Apr 22 19:21:38.737554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.737483 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7" event={"ID":"4de789a8-a689-462e-9ab7-1a48b43e2707","Type":"ContainerDied","Data":"db2e9ea0cc51408b43e2c81d9dc44f998505708b34b4cf608196de60943b3592"}
Apr 22 19:21:38.737554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.737496 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"
Apr 22 19:21:38.737671 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.737498 2579 scope.go:117] "RemoveContainer" containerID="39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0"
Apr 22 19:21:38.747498 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.747482 2579 scope.go:117] "RemoveContainer" containerID="fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2"
Apr 22 19:21:38.754324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.754243 2579 scope.go:117] "RemoveContainer" containerID="f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed"
Apr 22 19:21:38.761839 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.761820 2579 scope.go:117] "RemoveContainer" containerID="39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0"
Apr 22 19:21:38.762121 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:21:38.762101 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0\": container with ID starting with 39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0 not found: ID does not exist" containerID="39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0"
Apr 22 19:21:38.762198 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.762129 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0"} err="failed to get container status \"39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0\": rpc error: code = NotFound desc = could not find container \"39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0\": container with ID starting with 39fd6e721327881c91637cb943906c1bebaa7bb1bc26388c66dce3fdb05e11a0 not found: ID does not exist"
Apr 22 19:21:38.762198 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.762146 2579 scope.go:117] "RemoveContainer" containerID="fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2"
Apr 22 19:21:38.762503 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:21:38.762482 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2\": container with ID starting with fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2 not found: ID does not exist" containerID="fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2"
Apr 22 19:21:38.762549 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.762512 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2"} err="failed to get container status \"fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2\": rpc error: code = NotFound desc = could not find container \"fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2\": container with ID starting with fd7b6b7f6c6789d001f18b462906d5e8daac8543d6ae04ccc74c16d983d246f2 not found: ID does not exist"
Apr 22 19:21:38.762549 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.762533 2579 scope.go:117] "RemoveContainer" containerID="f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed"
Apr 22 19:21:38.762825 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:21:38.762798 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed\": container with ID starting with f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed not found: ID does not exist" containerID="f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed"
Apr 22 19:21:38.762886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.762828 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed"} err="failed to get container status \"f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed\": rpc error: code = NotFound desc = could not find container \"f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed\": container with ID starting with f44451d307a005ead266e3e9d20fcae01f54f6321d639096bbe2606ea798c5ed not found: ID does not exist"
Apr 22 19:21:38.770489 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.770467 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"]
Apr 22 19:21:38.774592 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:38.774571 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-cd7c759c9-gxjh7"]
Apr 22 19:21:39.742459 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:39.742423 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerStarted","Data":"6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826"}
Apr 22 19:21:39.742459 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:39.742464 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerStarted","Data":"7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be"}
Apr 22 19:21:39.742979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:39.742811 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"
Apr 22 19:21:39.742979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:39.742844 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"
Apr 22 19:21:39.743845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:39.743814 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:21:39.763289 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:39.763226 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podStartSLOduration=6.763215558 podStartE2EDuration="6.763215558s" podCreationTimestamp="2026-04-22 19:21:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:21:39.7625558 +0000 UTC m=+2111.762129721" watchObservedRunningTime="2026-04-22 19:21:39.763215558 +0000 UTC m=+2111.762789456"
Apr 22 19:21:40.542890 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:40.542853 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" path="/var/lib/kubelet/pods/4de789a8-a689-462e-9ab7-1a48b43e2707/volumes"
Apr 22 19:21:40.746224 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:40.746188 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:21:45.750125 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:45.750097 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"
Apr 22 19:21:45.750585 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:45.750552 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:21:55.750774 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:21:55.750738 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:22:05.751318 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:22:05.751256 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:22:15.750755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:22:15.750720 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:22:25.751185 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:22:25.751137 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:22:35.751171 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:22:35.751132 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:22:45.751408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:22:45.751371 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused"
Apr 22 19:22:53.540080 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:22:53.540053 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"
Apr 22 19:23:03.604716 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.604679 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"]
Apr 22 19:23:03.605182 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.605070 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" containerID="cri-o://7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be" gracePeriod=30
Apr 22 19:23:03.605182 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.605107 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kube-rbac-proxy" containerID="cri-o://6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826" gracePeriod=30
Apr 22 19:23:03.703666 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.703634 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"]
Apr 22 19:23:03.703956 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.703942 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kube-rbac-proxy"
Apr 22 19:23:03.704021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.703957 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kube-rbac-proxy"
Apr 22 19:23:03.704021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.703966 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container"
Apr 22 19:23:03.704021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.703972 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container"
Apr 22 19:23:03.704021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.703981 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="storage-initializer"
Apr 22 19:23:03.704021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.703987 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="storage-initializer"
Apr 22 19:23:03.704199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.704042 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kube-rbac-proxy"
Apr 22 19:23:03.704199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.704053 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="4de789a8-a689-462e-9ab7-1a48b43e2707" containerName="kserve-container"
Apr 22 19:23:03.707060 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.707038 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.709830 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.709807 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\""
Apr 22 19:23:03.709944 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.709807 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-predictor-serving-cert\""
Apr 22 19:23:03.717181 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.717153 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"]
Apr 22 19:23:03.805360 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.805329 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a827ea3a-a2b7-41fa-a963-8858cb40eb51-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.805500 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.805379 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.805500 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.805407 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a827ea3a-a2b7-41fa-a963-8858cb40eb51-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.805500 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.805428 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r75lt\" (UniqueName: \"kubernetes.io/projected/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kube-api-access-r75lt\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.906414 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.906336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.906414 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.906372 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a827ea3a-a2b7-41fa-a963-8858cb40eb51-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.906414 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.906395 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r75lt\" (UniqueName: \"kubernetes.io/projected/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kube-api-access-r75lt\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.906648 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.906437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a827ea3a-a2b7-41fa-a963-8858cb40eb51-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.906769 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.906751 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.907112 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.907094 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a827ea3a-a2b7-41fa-a963-8858cb40eb51-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.908822 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.908805 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a827ea3a-a2b7-41fa-a963-8858cb40eb51-proxy-tls\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.916228 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.916198 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r75lt\" (UniqueName: \"kubernetes.io/projected/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kube-api-access-r75lt\") pod \"isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:03.968655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.968624 2579 generic.go:358] "Generic (PLEG): container finished" podID="767cd722-e427-4d62-9c92-20920c8248dc" containerID="6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826" exitCode=2
Apr 22 19:23:03.968808 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:03.968696 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerDied","Data":"6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826"}
Apr 22 19:23:04.018982 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:04.018948 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:04.134410 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:04.134386 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"]
Apr 22 19:23:04.136650 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:23:04.136610 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda827ea3a_a2b7_41fa_a963_8858cb40eb51.slice/crio-ea8ff5b8d9927e070696e454242cb25215da6f52953bbb2556cff399c9571832 WatchSource:0}: Error finding container ea8ff5b8d9927e070696e454242cb25215da6f52953bbb2556cff399c9571832: Status 404 returned error can't find the container with id ea8ff5b8d9927e070696e454242cb25215da6f52953bbb2556cff399c9571832
Apr 22 19:23:04.138453 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:04.138434 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 19:23:04.973152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:04.973114 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerStarted","Data":"23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699"}
Apr 22 19:23:04.973152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:04.973153 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerStarted","Data":"ea8ff5b8d9927e070696e454242cb25215da6f52953bbb2556cff399c9571832"}
Apr 22 19:23:05.747355 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:05.747310 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.41:8643/healthz\": dial tcp 10.132.0.41:8643: connect: connection refused"
Apr 22 19:23:07.983638 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:07.983607 2579 generic.go:358] "Generic (PLEG): container finished" podID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerID="23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699" exitCode=0
Apr 22 19:23:07.983948 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:07.983647 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerDied","Data":"23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699"}
Apr 22 19:23:08.146624 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.146602 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"
Apr 22 19:23:08.243407 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.243374 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/767cd722-e427-4d62-9c92-20920c8248dc-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") pod \"767cd722-e427-4d62-9c92-20920c8248dc\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") "
Apr 22 19:23:08.243407 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.243414 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767cd722-e427-4d62-9c92-20920c8248dc-proxy-tls\") pod \"767cd722-e427-4d62-9c92-20920c8248dc\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") "
Apr 22 19:23:08.243664 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.243457 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-622d2\" (UniqueName: \"kubernetes.io/projected/767cd722-e427-4d62-9c92-20920c8248dc-kube-api-access-622d2\") pod \"767cd722-e427-4d62-9c92-20920c8248dc\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") "
Apr 22 19:23:08.243664 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.243484 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767cd722-e427-4d62-9c92-20920c8248dc-kserve-provision-location\") pod \"767cd722-e427-4d62-9c92-20920c8248dc\" (UID: \"767cd722-e427-4d62-9c92-20920c8248dc\") "
Apr 22 19:23:08.243743 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.243705 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/767cd722-e427-4d62-9c92-20920c8248dc-isvc-predictive-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-kube-rbac-proxy-sar-config") pod "767cd722-e427-4d62-9c92-20920c8248dc" (UID: "767cd722-e427-4d62-9c92-20920c8248dc"). InnerVolumeSpecName "isvc-predictive-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:23:08.243858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.243822 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/767cd722-e427-4d62-9c92-20920c8248dc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "767cd722-e427-4d62-9c92-20920c8248dc" (UID: "767cd722-e427-4d62-9c92-20920c8248dc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:23:08.245438 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.245420 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/767cd722-e427-4d62-9c92-20920c8248dc-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "767cd722-e427-4d62-9c92-20920c8248dc" (UID: "767cd722-e427-4d62-9c92-20920c8248dc"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:23:08.245523 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.245508 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/767cd722-e427-4d62-9c92-20920c8248dc-kube-api-access-622d2" (OuterVolumeSpecName: "kube-api-access-622d2") pod "767cd722-e427-4d62-9c92-20920c8248dc" (UID: "767cd722-e427-4d62-9c92-20920c8248dc"). InnerVolumeSpecName "kube-api-access-622d2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:23:08.344247 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.344215 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/767cd722-e427-4d62-9c92-20920c8248dc-isvc-predictive-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:23:08.344247 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.344245 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/767cd722-e427-4d62-9c92-20920c8248dc-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:23:08.344438 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.344258 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-622d2\" (UniqueName: \"kubernetes.io/projected/767cd722-e427-4d62-9c92-20920c8248dc-kube-api-access-622d2\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:23:08.344438 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.344283 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/767cd722-e427-4d62-9c92-20920c8248dc-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:23:08.988456 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.988421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerStarted","Data":"81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84"}
Apr 22 19:23:08.988882 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.988465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerStarted","Data":"69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f"}
Apr 22 19:23:08.988882 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.988787 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"
Apr 22 19:23:08.989892 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.989871 2579 generic.go:358] "Generic (PLEG): container finished" podID="767cd722-e427-4d62-9c92-20920c8248dc" containerID="7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be" exitCode=0
Apr 22 19:23:08.989946 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.989899 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerDied","Data":"7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be"}
Apr 22 19:23:08.989946 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.989916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps" event={"ID":"767cd722-e427-4d62-9c92-20920c8248dc","Type":"ContainerDied","Data":"8ea7052bf36040a23427911fd88944a189117b6057dafbb036a602ccca81c548"}
Apr 22 19:23:08.989946 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.989931 2579 scope.go:117] "RemoveContainer" containerID="6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826"
Apr 22 19:23:08.990046 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.989947 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"
Apr 22 19:23:08.997345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:08.997321 2579 scope.go:117] "RemoveContainer" containerID="7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be"
Apr 22 19:23:09.003774 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.003759 2579 scope.go:117] "RemoveContainer" containerID="6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183"
Apr 22 19:23:09.010684 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.010630 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podStartSLOduration=6.010615977 podStartE2EDuration="6.010615977s" podCreationTimestamp="2026-04-22 19:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:23:09.009013142 +0000 UTC m=+2201.008587041" watchObservedRunningTime="2026-04-22 19:23:09.010615977 +0000 UTC m=+2201.010189880"
Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.011872 2579 scope.go:117] "RemoveContainer" containerID="6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826"
Apr 22 19:23:09.013548 ip-10-0-137-19
kubenswrapper[2579]: E0422 19:23:09.012742 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826\": container with ID starting with 6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826 not found: ID does not exist" containerID="6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826" Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.012772 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826"} err="failed to get container status \"6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826\": rpc error: code = NotFound desc = could not find container \"6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826\": container with ID starting with 6c802709bf2c2d2e00ad1ff98913264f8c036c91fbed51b441fadb38cef8b826 not found: ID does not exist" Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.012794 2579 scope.go:117] "RemoveContainer" containerID="7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be" Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:23:09.013129 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be\": container with ID starting with 7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be not found: ID does not exist" containerID="7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be" Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.013157 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be"} err="failed to 
get container status \"7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be\": rpc error: code = NotFound desc = could not find container \"7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be\": container with ID starting with 7cfc1cbe465338ad5a2eacbeb6c8d72eb3c18c9384ce308a0f76f231c84424be not found: ID does not exist" Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.013177 2579 scope.go:117] "RemoveContainer" containerID="6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183" Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:23:09.013454 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183\": container with ID starting with 6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183 not found: ID does not exist" containerID="6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183" Apr 22 19:23:09.013548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.013478 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183"} err="failed to get container status \"6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183\": rpc error: code = NotFound desc = could not find container \"6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183\": container with ID starting with 6e6935a5c90e806d50c526d460168acd45167a1e9b084267a03efe8599ee7183 not found: ID does not exist" Apr 22 19:23:09.023595 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.023567 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"] Apr 22 19:23:09.028413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.028394 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-7ff98fd74d-bjgps"] Apr 22 19:23:09.998917 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:09.998885 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" Apr 22 19:23:10.000384 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:10.000335 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:23:10.543841 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:10.543810 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="767cd722-e427-4d62-9c92-20920c8248dc" path="/var/lib/kubelet/pods/767cd722-e427-4d62-9c92-20920c8248dc/volumes" Apr 22 19:23:11.001941 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:11.001899 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:23:16.006552 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:16.006525 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" Apr 22 19:23:16.007046 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:16.007022 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:23:26.007279 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:23:26.007221 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:23:36.007669 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:36.007628 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:23:46.006994 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:46.006902 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:23:56.007243 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:23:56.007204 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:24:06.007674 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:06.007638 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:24:16.007819 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:16.007772 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 22 19:24:23.540422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:23.540389 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" Apr 22 19:24:33.817221 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.817191 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"] Apr 22 19:24:33.817699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.817548 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" containerID="cri-o://69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f" gracePeriod=30 Apr 22 19:24:33.817699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.817633 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kube-rbac-proxy" containerID="cri-o://81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84" gracePeriod=30 Apr 22 19:24:33.922539 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922508 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm"] Apr 22 19:24:33.922785 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922773 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" Apr 22 19:24:33.922830 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:24:33.922786 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" Apr 22 19:24:33.922830 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922802 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="storage-initializer" Apr 22 19:24:33.922830 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922808 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="storage-initializer" Apr 22 19:24:33.922830 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922822 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kube-rbac-proxy" Apr 22 19:24:33.922830 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922828 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kube-rbac-proxy" Apr 22 19:24:33.922999 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922870 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kserve-container" Apr 22 19:24:33.922999 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.922881 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="767cd722-e427-4d62-9c92-20920c8248dc" containerName="kube-rbac-proxy" Apr 22 19:24:33.925874 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.925855 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:33.928944 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.928922 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-predictor-serving-cert\"" Apr 22 19:24:33.929087 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.929070 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 22 19:24:33.937137 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:33.937113 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm"] Apr 22 19:24:34.110530 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.110452 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dae1c729-2093-4df1-9810-09136190db35-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.110530 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.110488 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dae1c729-2093-4df1-9810-09136190db35-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.110700 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.110539 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbkvx\" (UniqueName: 
\"kubernetes.io/projected/dae1c729-2093-4df1-9810-09136190db35-kube-api-access-hbkvx\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.110700 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.110589 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dae1c729-2093-4df1-9810-09136190db35-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.211219 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.211167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbkvx\" (UniqueName: \"kubernetes.io/projected/dae1c729-2093-4df1-9810-09136190db35-kube-api-access-hbkvx\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.211219 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.211220 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dae1c729-2093-4df1-9810-09136190db35-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.211520 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.211303 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dae1c729-2093-4df1-9810-09136190db35-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.211520 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.211331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dae1c729-2093-4df1-9810-09136190db35-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.211771 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.211747 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dae1c729-2093-4df1-9810-09136190db35-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.211910 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.211892 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dae1c729-2093-4df1-9810-09136190db35-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.213870 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.213851 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dae1c729-2093-4df1-9810-09136190db35-proxy-tls\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.220909 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.220882 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbkvx\" (UniqueName: \"kubernetes.io/projected/dae1c729-2093-4df1-9810-09136190db35-kube-api-access-hbkvx\") pod \"isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.229343 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.229317 2579 generic.go:358] "Generic (PLEG): container finished" podID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerID="81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84" exitCode=2 Apr 22 19:24:34.229461 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.229374 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerDied","Data":"81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84"} Apr 22 19:24:34.237919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.237903 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:34.363649 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:34.360842 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm"] Apr 22 19:24:35.233056 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:35.233018 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerStarted","Data":"8a3636d58e597b2937fee14a70e5278833a35354346f1235c9284595ca230895"} Apr 22 19:24:35.233056 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:35.233054 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerStarted","Data":"b3bb02a26728ce4bfda5968d11bd391f6fca4cf54a06fec98dedf6b2761506b7"} Apr 22 19:24:36.002210 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:36.002167 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.42:8643/healthz\": dial tcp 10.132.0.42:8643: connect: connection refused" Apr 22 19:24:38.241899 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:38.241867 2579 generic.go:358] "Generic (PLEG): container finished" podID="dae1c729-2093-4df1-9810-09136190db35" containerID="8a3636d58e597b2937fee14a70e5278833a35354346f1235c9284595ca230895" exitCode=0 Apr 22 19:24:38.242297 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:38.241924 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" 
event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerDied","Data":"8a3636d58e597b2937fee14a70e5278833a35354346f1235c9284595ca230895"} Apr 22 19:24:38.967736 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:38.967712 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" Apr 22 19:24:39.151546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.151455 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a827ea3a-a2b7-41fa-a963-8858cb40eb51-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") pod \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " Apr 22 19:24:39.151546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.151525 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a827ea3a-a2b7-41fa-a963-8858cb40eb51-proxy-tls\") pod \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " Apr 22 19:24:39.151776 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.151560 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kserve-provision-location\") pod \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " Apr 22 19:24:39.151776 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.151585 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r75lt\" (UniqueName: \"kubernetes.io/projected/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kube-api-access-r75lt\") pod \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\" (UID: \"a827ea3a-a2b7-41fa-a963-8858cb40eb51\") " Apr 22 19:24:39.151920 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.151890 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a827ea3a-a2b7-41fa-a963-8858cb40eb51-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config") pod "a827ea3a-a2b7-41fa-a963-8858cb40eb51" (UID: "a827ea3a-a2b7-41fa-a963-8858cb40eb51"). InnerVolumeSpecName "isvc-predictive-lightgbm-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:24:39.151963 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.151908 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a827ea3a-a2b7-41fa-a963-8858cb40eb51" (UID: "a827ea3a-a2b7-41fa-a963-8858cb40eb51"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:24:39.153659 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.153638 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a827ea3a-a2b7-41fa-a963-8858cb40eb51-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a827ea3a-a2b7-41fa-a963-8858cb40eb51" (UID: "a827ea3a-a2b7-41fa-a963-8858cb40eb51"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:24:39.153745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.153678 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kube-api-access-r75lt" (OuterVolumeSpecName: "kube-api-access-r75lt") pod "a827ea3a-a2b7-41fa-a963-8858cb40eb51" (UID: "a827ea3a-a2b7-41fa-a963-8858cb40eb51"). InnerVolumeSpecName "kube-api-access-r75lt". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:24:39.246039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.246001 2579 generic.go:358] "Generic (PLEG): container finished" podID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerID="69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f" exitCode=0 Apr 22 19:24:39.246510 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.246085 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" Apr 22 19:24:39.246510 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.246085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerDied","Data":"69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f"} Apr 22 19:24:39.246510 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.246216 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b" event={"ID":"a827ea3a-a2b7-41fa-a963-8858cb40eb51","Type":"ContainerDied","Data":"ea8ff5b8d9927e070696e454242cb25215da6f52953bbb2556cff399c9571832"} Apr 22 19:24:39.246510 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.246243 2579 scope.go:117] "RemoveContainer" containerID="81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84" Apr 22 19:24:39.248137 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.248116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerStarted","Data":"98f545a0693afe8502de914c94c8f47259c8e1a631867d9a003434b0ba5a02e1"} Apr 22 19:24:39.248253 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.248144 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerStarted","Data":"9ab216e291df310f627c498f95a84cfc8f44e3ec76644d3ffea368f5a1763dd4"} Apr 22 19:24:39.254745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.252585 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a827ea3a-a2b7-41fa-a963-8858cb40eb51-isvc-predictive-lightgbm-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:24:39.254745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.252614 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a827ea3a-a2b7-41fa-a963-8858cb40eb51-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:24:39.254745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.252638 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:24:39.254745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.252652 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r75lt\" (UniqueName: \"kubernetes.io/projected/a827ea3a-a2b7-41fa-a963-8858cb40eb51-kube-api-access-r75lt\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:24:39.259842 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.259824 2579 scope.go:117] "RemoveContainer" containerID="69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f" Apr 22 19:24:39.266613 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.266594 2579 scope.go:117] "RemoveContainer" containerID="23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699" Apr 22 19:24:39.273195 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:24:39.273180 2579 scope.go:117] "RemoveContainer" containerID="81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84" Apr 22 19:24:39.273472 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:24:39.273453 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84\": container with ID starting with 81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84 not found: ID does not exist" containerID="81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84" Apr 22 19:24:39.273515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.273479 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84"} err="failed to get container status \"81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84\": rpc error: code = NotFound desc = could not find container \"81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84\": container with ID starting with 81d7293454d708a60376ef14771da092e81c5c1dc2814f36e729153553ccfe84 not found: ID does not exist" Apr 22 19:24:39.273515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.273495 2579 scope.go:117] "RemoveContainer" containerID="69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f" Apr 22 19:24:39.273688 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:24:39.273669 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f\": container with ID starting with 69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f not found: ID does not exist" containerID="69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f" Apr 22 19:24:39.273724 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:24:39.273695 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f"} err="failed to get container status \"69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f\": rpc error: code = NotFound desc = could not find container \"69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f\": container with ID starting with 69c2753ad80350c1541c6e22ce6d86d2489b4c95dab7f090f6a07a9cb384ed2f not found: ID does not exist" Apr 22 19:24:39.273724 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.273711 2579 scope.go:117] "RemoveContainer" containerID="23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699" Apr 22 19:24:39.273911 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:24:39.273898 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699\": container with ID starting with 23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699 not found: ID does not exist" containerID="23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699" Apr 22 19:24:39.273948 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.273913 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699"} err="failed to get container status \"23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699\": rpc error: code = NotFound desc = could not find container \"23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699\": container with ID starting with 23d6aaaa60e87365a3fabec98b21126391ff0a9ac1691be5429a9788ca46f699 not found: ID does not exist" Apr 22 19:24:39.277433 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.277397 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podStartSLOduration=6.277388106 podStartE2EDuration="6.277388106s" podCreationTimestamp="2026-04-22 19:24:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:24:39.276415971 +0000 UTC m=+2291.275989880" watchObservedRunningTime="2026-04-22 19:24:39.277388106 +0000 UTC m=+2291.276962003" Apr 22 19:24:39.291825 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.291801 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"] Apr 22 19:24:39.297480 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:39.297457 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-75cb94f9f-nw97b"] Apr 22 19:24:40.543217 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:40.543183 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" path="/var/lib/kubelet/pods/a827ea3a-a2b7-41fa-a963-8858cb40eb51/volumes" Apr 22 19:24:44.249002 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:44.248964 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:44.249433 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:44.249413 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:24:44.253690 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:24:44.253662 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:25:15.268872 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:25:15.268827 2579 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 22 19:25:25.268777 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:25:25.268728 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 22 19:25:35.268172 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:25:35.268127 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 22 19:25:45.268320 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:25:45.268255 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 22 19:25:55.271417 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:25:55.271387 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:26:04.057099 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:26:04.057047 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm"] Apr 22 19:26:04.057645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.057489 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" containerID="cri-o://9ab216e291df310f627c498f95a84cfc8f44e3ec76644d3ffea368f5a1763dd4" gracePeriod=30 Apr 22 19:26:04.057645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.057546 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kube-rbac-proxy" containerID="cri-o://98f545a0693afe8502de914c94c8f47259c8e1a631867d9a003434b0ba5a02e1" gracePeriod=30 Apr 22 19:26:04.209294 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209248 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7"] Apr 22 19:26:04.209579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209553 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="storage-initializer" Apr 22 19:26:04.209579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209569 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="storage-initializer" Apr 22 19:26:04.209747 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209598 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" Apr 22 19:26:04.209747 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209607 2579 
state_mem.go:107] "Deleted CPUSet assignment" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" Apr 22 19:26:04.209747 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209617 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kube-rbac-proxy" Apr 22 19:26:04.209747 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209622 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kube-rbac-proxy" Apr 22 19:26:04.209747 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209684 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kube-rbac-proxy" Apr 22 19:26:04.209747 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.209694 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a827ea3a-a2b7-41fa-a963-8858cb40eb51" containerName="kserve-container" Apr 22 19:26:04.212525 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.212508 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.216301 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.216280 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 22 19:26:04.216760 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.216744 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-xgboost-v2-predictor-serving-cert\"" Apr 22 19:26:04.227622 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.227599 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7"] Apr 22 19:26:04.254129 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.254089 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.43:8643/healthz\": dial tcp 10.132.0.43:8643: connect: connection refused" Apr 22 19:26:04.347651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.347564 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.347651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.347624 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.347852 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.347671 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.347852 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.347747 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dn9cg\" (UniqueName: \"kubernetes.io/projected/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kube-api-access-dn9cg\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.449108 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.449063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.449108 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.449106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.449388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.449125 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dn9cg\" (UniqueName: \"kubernetes.io/projected/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kube-api-access-dn9cg\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.449388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.449167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.449641 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.449619 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.449863 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.449839 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.451555 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.451527 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-proxy-tls\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.459699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.459679 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dn9cg\" (UniqueName: \"kubernetes.io/projected/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kube-api-access-dn9cg\") pod \"isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.485448 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.485410 2579 generic.go:358] "Generic (PLEG): container finished" podID="dae1c729-2093-4df1-9810-09136190db35" containerID="98f545a0693afe8502de914c94c8f47259c8e1a631867d9a003434b0ba5a02e1" exitCode=2 Apr 22 19:26:04.485581 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.485458 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerDied","Data":"98f545a0693afe8502de914c94c8f47259c8e1a631867d9a003434b0ba5a02e1"} Apr 22 19:26:04.522322 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.522299 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:04.648738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:04.648537 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7"] Apr 22 19:26:04.651314 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:26:04.651286 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea910eed_2eb2_4a0f_a3c2_d152aaf91a9c.slice/crio-08dcacbfaf9cbb95cd3cb317aa938b0d4495d27fe5368aaaed6d7ca324124432 WatchSource:0}: Error finding container 08dcacbfaf9cbb95cd3cb317aa938b0d4495d27fe5368aaaed6d7ca324124432: Status 404 returned error can't find the container with id 08dcacbfaf9cbb95cd3cb317aa938b0d4495d27fe5368aaaed6d7ca324124432 Apr 22 19:26:05.268381 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:05.268335 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.43:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.43:8080: connect: connection refused" Apr 22 19:26:05.488851 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:05.488813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerStarted","Data":"69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872"} Apr 22 19:26:05.489009 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:05.488855 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" 
event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerStarted","Data":"08dcacbfaf9cbb95cd3cb317aa938b0d4495d27fe5368aaaed6d7ca324124432"} Apr 22 19:26:08.499216 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.499185 2579 generic.go:358] "Generic (PLEG): container finished" podID="dae1c729-2093-4df1-9810-09136190db35" containerID="9ab216e291df310f627c498f95a84cfc8f44e3ec76644d3ffea368f5a1763dd4" exitCode=0 Apr 22 19:26:08.499589 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.499279 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerDied","Data":"9ab216e291df310f627c498f95a84cfc8f44e3ec76644d3ffea368f5a1763dd4"} Apr 22 19:26:08.500498 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.500472 2579 generic.go:358] "Generic (PLEG): container finished" podID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerID="69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872" exitCode=0 Apr 22 19:26:08.500600 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.500515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerDied","Data":"69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872"} Apr 22 19:26:08.606684 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.606663 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:26:08.685494 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.685462 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dae1c729-2093-4df1-9810-09136190db35-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"dae1c729-2093-4df1-9810-09136190db35\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " Apr 22 19:26:08.685663 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.685517 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dae1c729-2093-4df1-9810-09136190db35-proxy-tls\") pod \"dae1c729-2093-4df1-9810-09136190db35\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " Apr 22 19:26:08.685663 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.685549 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbkvx\" (UniqueName: \"kubernetes.io/projected/dae1c729-2093-4df1-9810-09136190db35-kube-api-access-hbkvx\") pod \"dae1c729-2093-4df1-9810-09136190db35\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " Apr 22 19:26:08.685663 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.685604 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dae1c729-2093-4df1-9810-09136190db35-kserve-provision-location\") pod \"dae1c729-2093-4df1-9810-09136190db35\" (UID: \"dae1c729-2093-4df1-9810-09136190db35\") " Apr 22 19:26:08.685962 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.685931 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dae1c729-2093-4df1-9810-09136190db35-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: 
"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config") pod "dae1c729-2093-4df1-9810-09136190db35" (UID: "dae1c729-2093-4df1-9810-09136190db35"). InnerVolumeSpecName "isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:26:08.686052 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.685963 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dae1c729-2093-4df1-9810-09136190db35-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dae1c729-2093-4df1-9810-09136190db35" (UID: "dae1c729-2093-4df1-9810-09136190db35"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:26:08.687656 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.687634 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dae1c729-2093-4df1-9810-09136190db35-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "dae1c729-2093-4df1-9810-09136190db35" (UID: "dae1c729-2093-4df1-9810-09136190db35"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:26:08.687723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.687677 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dae1c729-2093-4df1-9810-09136190db35-kube-api-access-hbkvx" (OuterVolumeSpecName: "kube-api-access-hbkvx") pod "dae1c729-2093-4df1-9810-09136190db35" (UID: "dae1c729-2093-4df1-9810-09136190db35"). InnerVolumeSpecName "kube-api-access-hbkvx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:26:08.786887 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.786848 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hbkvx\" (UniqueName: \"kubernetes.io/projected/dae1c729-2093-4df1-9810-09136190db35-kube-api-access-hbkvx\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:26:08.786887 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.786881 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dae1c729-2093-4df1-9810-09136190db35-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:26:08.786887 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.786892 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/dae1c729-2093-4df1-9810-09136190db35-isvc-predictive-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:26:08.787124 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:08.786903 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/dae1c729-2093-4df1-9810-09136190db35-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:26:09.505164 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.505128 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" event={"ID":"dae1c729-2093-4df1-9810-09136190db35","Type":"ContainerDied","Data":"b3bb02a26728ce4bfda5968d11bd391f6fca4cf54a06fec98dedf6b2761506b7"} Apr 22 19:26:09.505164 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.505161 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm" Apr 22 19:26:09.505739 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.505175 2579 scope.go:117] "RemoveContainer" containerID="98f545a0693afe8502de914c94c8f47259c8e1a631867d9a003434b0ba5a02e1" Apr 22 19:26:09.507279 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.507224 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerStarted","Data":"ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60"} Apr 22 19:26:09.507393 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.507323 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerStarted","Data":"70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae"} Apr 22 19:26:09.507572 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.507552 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:09.513536 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.513518 2579 scope.go:117] "RemoveContainer" containerID="9ab216e291df310f627c498f95a84cfc8f44e3ec76644d3ffea368f5a1763dd4" Apr 22 19:26:09.520690 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.520675 2579 scope.go:117] "RemoveContainer" containerID="8a3636d58e597b2937fee14a70e5278833a35354346f1235c9284595ca230895" Apr 22 19:26:09.533582 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.533543 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" podStartSLOduration=5.533532699 podStartE2EDuration="5.533532699s" podCreationTimestamp="2026-04-22 
19:26:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:26:09.531304199 +0000 UTC m=+2381.530878097" watchObservedRunningTime="2026-04-22 19:26:09.533532699 +0000 UTC m=+2381.533106596" Apr 22 19:26:09.547698 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.547665 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm"] Apr 22 19:26:09.555695 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:09.555674 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-b5d4f6b79-7rlkm"] Apr 22 19:26:10.511275 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:10.511234 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:10.543611 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:10.543580 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dae1c729-2093-4df1-9810-09136190db35" path="/var/lib/kubelet/pods/dae1c729-2093-4df1-9810-09136190db35/volumes" Apr 22 19:26:16.521299 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:16.521254 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:26:46.521885 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:46.521798 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 22 19:26:56.522533 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:26:56.522496 
2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 22 19:27:06.522495 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:06.522455 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 22 19:27:16.522503 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:16.522466 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.44:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.44:8080: connect: connection refused" Apr 22 19:27:22.543301 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:22.543243 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:27:24.268582 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.268544 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7"] Apr 22 19:27:24.268998 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.268820 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" 
podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" containerID="cri-o://70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae" gracePeriod=30 Apr 22 19:27:24.268998 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.268864 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kube-rbac-proxy" containerID="cri-o://ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60" gracePeriod=30 Apr 22 19:27:24.371791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.371758 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7"] Apr 22 19:27:24.372117 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.372099 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" Apr 22 19:27:24.372214 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.372119 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" Apr 22 19:27:24.372214 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.372141 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="storage-initializer" Apr 22 19:27:24.372214 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.372147 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="storage-initializer" Apr 22 19:27:24.372214 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.372157 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kube-rbac-proxy" Apr 22 19:27:24.372214 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:27:24.372162 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kube-rbac-proxy" Apr 22 19:27:24.372214 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.372213 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kserve-container" Apr 22 19:27:24.372474 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.372224 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="dae1c729-2093-4df1-9810-09136190db35" containerName="kube-rbac-proxy" Apr 22 19:27:24.375370 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.375338 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.378021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.377998 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-predictor-serving-cert\"" Apr 22 19:27:24.378146 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.378057 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\"" Apr 22 19:27:24.386295 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.386241 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7"] Apr 22 19:27:24.458422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.458373 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94b1e8d9-3893-4fa7-9877-0f781402276d-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: 
\"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.458607 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.458476 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94b1e8d9-3893-4fa7-9877-0f781402276d-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.458607 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.458533 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.458687 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.458645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5bd7\" (UniqueName: \"kubernetes.io/projected/94b1e8d9-3893-4fa7-9877-0f781402276d-kube-api-access-l5bd7\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.559715 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.559626 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.559715 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.559687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5bd7\" (UniqueName: \"kubernetes.io/projected/94b1e8d9-3893-4fa7-9877-0f781402276d-kube-api-access-l5bd7\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.559951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.559746 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94b1e8d9-3893-4fa7-9877-0f781402276d-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.559951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.559776 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94b1e8d9-3893-4fa7-9877-0f781402276d-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.559951 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:27:24.559796 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-serving-cert: secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 22 19:27:24.559951 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:27:24.559867 2579 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls podName:94b1e8d9-3893-4fa7-9877-0f781402276d nodeName:}" failed. No retries permitted until 2026-04-22 19:27:25.059846223 +0000 UTC m=+2457.059420097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls") pod "isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" (UID: "94b1e8d9-3893-4fa7-9877-0f781402276d") : secret "isvc-predictive-lightgbm-v2-predictor-serving-cert" not found Apr 22 19:27:24.560211 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.560189 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94b1e8d9-3893-4fa7-9877-0f781402276d-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.560447 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.560427 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94b1e8d9-3893-4fa7-9877-0f781402276d-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.570066 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.570043 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5bd7\" (UniqueName: \"kubernetes.io/projected/94b1e8d9-3893-4fa7-9877-0f781402276d-kube-api-access-l5bd7\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:24.714736 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.714703 2579 generic.go:358] "Generic (PLEG): container finished" podID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerID="ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60" exitCode=2 Apr 22 19:27:24.714910 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:24.714781 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerDied","Data":"ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60"} Apr 22 19:27:25.064023 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:25.063961 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:25.066468 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:25.066437 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls\") pod \"isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:25.285988 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:25.285949 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:25.428504 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:25.428351 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7"] Apr 22 19:27:25.431109 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:27:25.431080 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94b1e8d9_3893_4fa7_9877_0f781402276d.slice/crio-d53a948b60fcdcef194ffe07649af13331d01ab40dffda09b41cbfe7dcd4db30 WatchSource:0}: Error finding container d53a948b60fcdcef194ffe07649af13331d01ab40dffda09b41cbfe7dcd4db30: Status 404 returned error can't find the container with id d53a948b60fcdcef194ffe07649af13331d01ab40dffda09b41cbfe7dcd4db30 Apr 22 19:27:25.719192 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:25.719154 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerStarted","Data":"6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a"} Apr 22 19:27:25.719379 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:25.719198 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerStarted","Data":"d53a948b60fcdcef194ffe07649af13331d01ab40dffda09b41cbfe7dcd4db30"} Apr 22 19:27:26.516337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:26.516287 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.44:8643/healthz\": dial tcp 
10.132.0.44:8643: connect: connection refused" Apr 22 19:27:29.006569 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.006548 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:27:29.098121 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.098023 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " Apr 22 19:27:29.098121 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.098103 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-proxy-tls\") pod \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " Apr 22 19:27:29.098121 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.098125 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kserve-provision-location\") pod \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " Apr 22 19:27:29.098462 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.098146 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dn9cg\" (UniqueName: \"kubernetes.io/projected/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kube-api-access-dn9cg\") pod \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\" (UID: \"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c\") " Apr 22 19:27:29.098462 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.098421 2579 operation_generator.go:781] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config") pod "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" (UID: "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c"). InnerVolumeSpecName "isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:27:29.098462 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.098454 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" (UID: "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:27:29.100365 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.100343 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" (UID: "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:27:29.100515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.100410 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kube-api-access-dn9cg" (OuterVolumeSpecName: "kube-api-access-dn9cg") pod "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" (UID: "ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c"). InnerVolumeSpecName "kube-api-access-dn9cg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:27:29.198930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.198900 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:27:29.198930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.198929 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:27:29.199107 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.198938 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dn9cg\" (UniqueName: \"kubernetes.io/projected/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-kube-api-access-dn9cg\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:27:29.199107 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.198950 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c-isvc-predictive-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:27:29.738177 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.738138 2579 generic.go:358] "Generic (PLEG): container finished" podID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerID="70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae" exitCode=0 Apr 22 19:27:29.738374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.738218 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" 
event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerDied","Data":"70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae"} Apr 22 19:27:29.738374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.738238 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" Apr 22 19:27:29.738374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.738287 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7" event={"ID":"ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c","Type":"ContainerDied","Data":"08dcacbfaf9cbb95cd3cb317aa938b0d4495d27fe5368aaaed6d7ca324124432"} Apr 22 19:27:29.738374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.738312 2579 scope.go:117] "RemoveContainer" containerID="ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60" Apr 22 19:27:29.739569 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.739550 2579 generic.go:358] "Generic (PLEG): container finished" podID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerID="6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a" exitCode=0 Apr 22 19:27:29.739697 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.739589 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerDied","Data":"6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a"} Apr 22 19:27:29.746709 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.746690 2579 scope.go:117] "RemoveContainer" containerID="70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae" Apr 22 19:27:29.753680 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.753656 2579 scope.go:117] "RemoveContainer" containerID="69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872" Apr 22 
19:27:29.762349 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.762329 2579 scope.go:117] "RemoveContainer" containerID="ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60" Apr 22 19:27:29.762656 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:27:29.762631 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60\": container with ID starting with ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60 not found: ID does not exist" containerID="ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60" Apr 22 19:27:29.762745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.762663 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60"} err="failed to get container status \"ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60\": rpc error: code = NotFound desc = could not find container \"ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60\": container with ID starting with ac39ccbb0ac64a7d4b0163c57456939205ffde7a009c3eb81631c55ddfa52a60 not found: ID does not exist" Apr 22 19:27:29.762745 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.762681 2579 scope.go:117] "RemoveContainer" containerID="70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae" Apr 22 19:27:29.762926 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:27:29.762906 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae\": container with ID starting with 70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae not found: ID does not exist" containerID="70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae" Apr 22 19:27:29.762990 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.762940 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae"} err="failed to get container status \"70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae\": rpc error: code = NotFound desc = could not find container \"70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae\": container with ID starting with 70e7c44277438921647baf0a606d7f44f2fa552c2c3258ff2f380f204c7598ae not found: ID does not exist" Apr 22 19:27:29.762990 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.762957 2579 scope.go:117] "RemoveContainer" containerID="69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872" Apr 22 19:27:29.763181 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:27:29.763165 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872\": container with ID starting with 69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872 not found: ID does not exist" containerID="69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872" Apr 22 19:27:29.763238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.763185 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872"} err="failed to get container status \"69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872\": rpc error: code = NotFound desc = could not find container \"69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872\": container with ID starting with 69d39dc481006fdbcdd41a96c7a7e718c27c3e3ba28f75cd9a1cb5a1bf17e872 not found: ID does not exist" Apr 22 19:27:29.781167 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.781136 2579 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7"] Apr 22 19:27:29.786141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:29.786120 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-6577c65fd8-tgjh7"] Apr 22 19:27:30.543678 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:30.543644 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" path="/var/lib/kubelet/pods/ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c/volumes" Apr 22 19:27:30.744104 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:30.744073 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerStarted","Data":"5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c"} Apr 22 19:27:30.744104 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:30.744112 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerStarted","Data":"f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4"} Apr 22 19:27:30.744375 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:30.744317 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:30.744431 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:30.744384 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:27:30.768926 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:30.768875 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podStartSLOduration=6.768861682 podStartE2EDuration="6.768861682s" podCreationTimestamp="2026-04-22 19:27:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:27:30.766648683 +0000 UTC m=+2462.766222617" watchObservedRunningTime="2026-04-22 19:27:30.768861682 +0000 UTC m=+2462.768435578" Apr 22 19:27:36.750744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:27:36.750708 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:28:06.751851 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:06.751804 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 22 19:28:16.752312 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:16.752199 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 22 19:28:26.752123 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:26.752082 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" probeResult="failure" output="Get 
\"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 22 19:28:36.751434 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:36.751393 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 22 19:28:46.755194 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:46.755167 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:28:54.467411 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:54.467381 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7"] Apr 22 19:28:54.467818 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:54.467664 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" containerID="cri-o://f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4" gracePeriod=30 Apr 22 19:28:54.467818 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:54.467724 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kube-rbac-proxy" containerID="cri-o://5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c" gracePeriod=30 Apr 22 19:28:54.977980 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:54.977944 2579 generic.go:358] "Generic (PLEG): 
container finished" podID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerID="5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c" exitCode=2 Apr 22 19:28:54.978165 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:54.978018 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerDied","Data":"5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c"} Apr 22 19:28:56.675855 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.675822 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp"] Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676087 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676097 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676111 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="storage-initializer" Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676118 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="storage-initializer" Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676126 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kube-rbac-proxy" Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676131 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kube-rbac-proxy" Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676180 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kube-rbac-proxy" Apr 22 19:28:56.676231 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.676188 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="ea910eed-2eb2-4a0f-a3c2-d152aaf91a9c" containerName="kserve-container" Apr 22 19:28:56.679094 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.679073 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.683229 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.683203 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-predictor-serving-cert\"" Apr 22 19:28:56.683229 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.683223 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-kube-rbac-proxy-sar-config\"" Apr 22 19:28:56.693095 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.693065 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp"] Apr 22 19:28:56.747557 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.747513 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.45:8643/healthz\": dial tcp 10.132.0.45:8643: connect: connection refused" Apr 22 19:28:56.751940 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.751916 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.45:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.45:8080: connect: connection refused" Apr 22 19:28:56.754773 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.754751 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dd59008-9be1-4900-9126-d96c1eec477b-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.754859 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.754787 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd59008-9be1-4900-9126-d96c1eec477b-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.754900 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.754847 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r74l\" (UniqueName: \"kubernetes.io/projected/3dd59008-9be1-4900-9126-d96c1eec477b-kube-api-access-4r74l\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.754900 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.754893 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/3dd59008-9be1-4900-9126-d96c1eec477b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.855889 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.855842 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dd59008-9be1-4900-9126-d96c1eec477b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.856090 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.855906 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dd59008-9be1-4900-9126-d96c1eec477b-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.856090 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.855932 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd59008-9be1-4900-9126-d96c1eec477b-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.856090 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.855965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r74l\" (UniqueName: \"kubernetes.io/projected/3dd59008-9be1-4900-9126-d96c1eec477b-kube-api-access-4r74l\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: 
\"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.856416 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.856390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd59008-9be1-4900-9126-d96c1eec477b-kserve-provision-location\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.856649 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.856628 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dd59008-9be1-4900-9126-d96c1eec477b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.858492 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.858475 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dd59008-9be1-4900-9126-d96c1eec477b-proxy-tls\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.865452 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.865418 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r74l\" (UniqueName: \"kubernetes.io/projected/3dd59008-9be1-4900-9126-d96c1eec477b-kube-api-access-4r74l\") pod \"isvc-sklearn-predictor-d8dbfbbb9-vp7dp\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:56.988530 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:56.988483 2579 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:28:57.125736 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:57.125652 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp"] Apr 22 19:28:57.128086 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:28:57.128059 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd59008_9be1_4900_9126_d96c1eec477b.slice/crio-f8f02b503b21883c2d497a138946415b1517646aaebda4e2af312a6aa68c029a WatchSource:0}: Error finding container f8f02b503b21883c2d497a138946415b1517646aaebda4e2af312a6aa68c029a: Status 404 returned error can't find the container with id f8f02b503b21883c2d497a138946415b1517646aaebda4e2af312a6aa68c029a Apr 22 19:28:57.129934 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:57.129917 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:28:57.987128 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:57.987084 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerStarted","Data":"9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9"} Apr 22 19:28:57.987128 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:57.987131 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerStarted","Data":"f8f02b503b21883c2d497a138946415b1517646aaebda4e2af312a6aa68c029a"} Apr 22 19:28:59.698202 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.698178 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:28:59.772966 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.772928 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls\") pod \"94b1e8d9-3893-4fa7-9877-0f781402276d\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " Apr 22 19:28:59.773149 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.772983 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94b1e8d9-3893-4fa7-9877-0f781402276d-kserve-provision-location\") pod \"94b1e8d9-3893-4fa7-9877-0f781402276d\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " Apr 22 19:28:59.773149 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.773014 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5bd7\" (UniqueName: \"kubernetes.io/projected/94b1e8d9-3893-4fa7-9877-0f781402276d-kube-api-access-l5bd7\") pod \"94b1e8d9-3893-4fa7-9877-0f781402276d\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " Apr 22 19:28:59.773149 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.773052 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94b1e8d9-3893-4fa7-9877-0f781402276d-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") pod \"94b1e8d9-3893-4fa7-9877-0f781402276d\" (UID: \"94b1e8d9-3893-4fa7-9877-0f781402276d\") " Apr 22 19:28:59.773422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.773391 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94b1e8d9-3893-4fa7-9877-0f781402276d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"94b1e8d9-3893-4fa7-9877-0f781402276d" (UID: "94b1e8d9-3893-4fa7-9877-0f781402276d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:28:59.773521 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.773495 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94b1e8d9-3893-4fa7-9877-0f781402276d-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config") pod "94b1e8d9-3893-4fa7-9877-0f781402276d" (UID: "94b1e8d9-3893-4fa7-9877-0f781402276d"). InnerVolumeSpecName "isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:28:59.775090 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.775069 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94b1e8d9-3893-4fa7-9877-0f781402276d-kube-api-access-l5bd7" (OuterVolumeSpecName: "kube-api-access-l5bd7") pod "94b1e8d9-3893-4fa7-9877-0f781402276d" (UID: "94b1e8d9-3893-4fa7-9877-0f781402276d"). InnerVolumeSpecName "kube-api-access-l5bd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:28:59.775090 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.775079 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "94b1e8d9-3893-4fa7-9877-0f781402276d" (UID: "94b1e8d9-3893-4fa7-9877-0f781402276d"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:28:59.873817 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.873727 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l5bd7\" (UniqueName: \"kubernetes.io/projected/94b1e8d9-3893-4fa7-9877-0f781402276d-kube-api-access-l5bd7\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:28:59.873817 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.873758 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/94b1e8d9-3893-4fa7-9877-0f781402276d-isvc-predictive-lightgbm-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:28:59.873817 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.873769 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/94b1e8d9-3893-4fa7-9877-0f781402276d-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:28:59.873817 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.873780 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/94b1e8d9-3893-4fa7-9877-0f781402276d-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:28:59.993320 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.993285 2579 generic.go:358] "Generic (PLEG): container finished" podID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerID="f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4" exitCode=0 Apr 22 19:28:59.993484 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.993368 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" Apr 22 19:28:59.993484 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.993368 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerDied","Data":"f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4"} Apr 22 19:28:59.993484 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.993413 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7" event={"ID":"94b1e8d9-3893-4fa7-9877-0f781402276d","Type":"ContainerDied","Data":"d53a948b60fcdcef194ffe07649af13331d01ab40dffda09b41cbfe7dcd4db30"} Apr 22 19:28:59.993484 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:28:59.993436 2579 scope.go:117] "RemoveContainer" containerID="5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c" Apr 22 19:29:00.001771 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.001751 2579 scope.go:117] "RemoveContainer" containerID="f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4" Apr 22 19:29:00.008970 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.008950 2579 scope.go:117] "RemoveContainer" containerID="6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a" Apr 22 19:29:00.017030 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.017006 2579 scope.go:117] "RemoveContainer" containerID="5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c" Apr 22 19:29:00.017421 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:29:00.017396 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c\": container with ID starting with 5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c 
not found: ID does not exist" containerID="5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c" Apr 22 19:29:00.017540 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.017435 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c"} err="failed to get container status \"5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c\": rpc error: code = NotFound desc = could not find container \"5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c\": container with ID starting with 5b31077c7f6393d57120483dc455ffc0c325cb389d68b614470dea402748e16c not found: ID does not exist" Apr 22 19:29:00.017540 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.017454 2579 scope.go:117] "RemoveContainer" containerID="f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4" Apr 22 19:29:00.017742 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:29:00.017721 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4\": container with ID starting with f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4 not found: ID does not exist" containerID="f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4" Apr 22 19:29:00.017813 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.017752 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4"} err="failed to get container status \"f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4\": rpc error: code = NotFound desc = could not find container \"f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4\": container with ID starting with f087d3f1b75d49bad7f43a89c9ac3dc5c5e6df64aad4343bcc40cc13cb24eff4 not found: ID does 
not exist" Apr 22 19:29:00.017813 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.017772 2579 scope.go:117] "RemoveContainer" containerID="6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a" Apr 22 19:29:00.018071 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:29:00.018054 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a\": container with ID starting with 6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a not found: ID does not exist" containerID="6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a" Apr 22 19:29:00.018131 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.018077 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a"} err="failed to get container status \"6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a\": rpc error: code = NotFound desc = could not find container \"6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a\": container with ID starting with 6ded0f13b9e5158ddef0337139242961717e770f7ade6d9f9c4e437f0270860a not found: ID does not exist" Apr 22 19:29:00.018438 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.018413 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7"] Apr 22 19:29:00.021334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.021311 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-865b4598f7-rc8k7"] Apr 22 19:29:00.542601 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:00.542570 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" 
path="/var/lib/kubelet/pods/94b1e8d9-3893-4fa7-9877-0f781402276d/volumes" Apr 22 19:29:02.000901 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:02.000856 2579 generic.go:358] "Generic (PLEG): container finished" podID="3dd59008-9be1-4900-9126-d96c1eec477b" containerID="9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9" exitCode=0 Apr 22 19:29:02.001409 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:02.000919 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerDied","Data":"9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9"} Apr 22 19:29:03.005792 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:03.005758 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerStarted","Data":"477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900"} Apr 22 19:29:03.006212 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:03.005799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerStarted","Data":"9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c"} Apr 22 19:29:03.006212 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:03.006175 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:29:03.006344 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:03.006326 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:29:03.007638 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:03.007612 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:29:03.025432 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:03.025382 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podStartSLOduration=7.025366912 podStartE2EDuration="7.025366912s" podCreationTimestamp="2026-04-22 19:28:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:29:03.024487551 +0000 UTC m=+2555.024061460" watchObservedRunningTime="2026-04-22 19:29:03.025366912 +0000 UTC m=+2555.024940805" Apr 22 19:29:04.009115 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:04.009082 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:29:09.013314 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:09.013280 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:29:09.013871 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:09.013748 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:29:19.014122 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:19.014083 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" 
podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:29:29.014180 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:29.014142 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:29:39.013957 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:39.013917 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:29:49.013854 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:49.013775 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:29:59.014169 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:29:59.014135 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:30:09.015014 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:09.014977 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:30:16.784202 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.784165 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp"] Apr 22 19:30:16.784681 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.784499 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" containerID="cri-o://9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c" gracePeriod=30 Apr 22 19:30:16.784681 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.784561 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kube-rbac-proxy" containerID="cri-o://477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900" gracePeriod=30 Apr 22 19:30:16.851604 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851567 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"] Apr 22 19:30:16.851839 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851827 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kube-rbac-proxy" Apr 22 19:30:16.851894 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851840 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kube-rbac-proxy" Apr 22 19:30:16.851894 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851850 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="storage-initializer" Apr 22 19:30:16.851894 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851856 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="storage-initializer" Apr 22 19:30:16.851894 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851878 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" Apr 22 19:30:16.851894 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851884 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" Apr 22 19:30:16.852053 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851927 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kserve-container" Apr 22 19:30:16.852053 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.851937 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="94b1e8d9-3893-4fa7-9877-0f781402276d" containerName="kube-rbac-proxy" Apr 22 19:30:16.858498 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.858471 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:16.861676 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.861649 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-predictor-serving-cert\"" Apr 22 19:30:16.861811 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.861706 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 22 19:30:16.866009 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.865987 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"] Apr 22 19:30:16.960521 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.960485 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/73300898-93ad-4277-8c6b-812df4cf43e5-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:16.960691 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.960536 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:16.960691 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.960651 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73300898-93ad-4277-8c6b-812df4cf43e5-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:16.960691 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:16.960684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/73300898-93ad-4277-8c6b-812df4cf43e5-kube-api-access-jl8tl\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.061229 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.061144 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/73300898-93ad-4277-8c6b-812df4cf43e5-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.061229 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.061181 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/73300898-93ad-4277-8c6b-812df4cf43e5-kube-api-access-jl8tl\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.061229 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.061214 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73300898-93ad-4277-8c6b-812df4cf43e5-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.061498 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.061241 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.061498 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:30:17.061366 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-serving-cert: secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 22 19:30:17.061498 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:30:17.061449 
2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls podName:73300898-93ad-4277-8c6b-812df4cf43e5 nodeName:}" failed. No retries permitted until 2026-04-22 19:30:17.561427537 +0000 UTC m=+2629.561001426 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls") pod "sklearn-v2-mlserver-predictor-65d8664766-t7jfx" (UID: "73300898-93ad-4277-8c6b-812df4cf43e5") : secret "sklearn-v2-mlserver-predictor-serving-cert" not found Apr 22 19:30:17.061670 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.061649 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73300898-93ad-4277-8c6b-812df4cf43e5-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.061886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.061867 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73300898-93ad-4277-8c6b-812df4cf43e5-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.081628 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.081606 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/73300898-93ad-4277-8c6b-812df4cf43e5-kube-api-access-jl8tl\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " 
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.209930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.209898 2579 generic.go:358] "Generic (PLEG): container finished" podID="3dd59008-9be1-4900-9126-d96c1eec477b" containerID="477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900" exitCode=2 Apr 22 19:30:17.210077 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.209977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerDied","Data":"477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900"} Apr 22 19:30:17.564993 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.564956 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.567398 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.567375 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls\") pod \"sklearn-v2-mlserver-predictor-65d8664766-t7jfx\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.769907 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.769871 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:17.889338 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:17.889061 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"] Apr 22 19:30:17.891815 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:30:17.891790 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73300898_93ad_4277_8c6b_812df4cf43e5.slice/crio-b10d5e2861fb92fe09383b712f31e624945c17967acc1ee70d0f4a2bd2e3e54e WatchSource:0}: Error finding container b10d5e2861fb92fe09383b712f31e624945c17967acc1ee70d0f4a2bd2e3e54e: Status 404 returned error can't find the container with id b10d5e2861fb92fe09383b712f31e624945c17967acc1ee70d0f4a2bd2e3e54e Apr 22 19:30:18.214202 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:18.214166 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerStarted","Data":"6c93f14af76629d1f9dd586bee637ae86f201836beb906f87b237410baa92785"} Apr 22 19:30:18.214202 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:18.214203 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerStarted","Data":"b10d5e2861fb92fe09383b712f31e624945c17967acc1ee70d0f4a2bd2e3e54e"} Apr 22 19:30:19.009930 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:19.009887 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.46:8643/healthz\": dial tcp 10.132.0.46:8643: connect: connection refused" Apr 22 
19:30:19.014278 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:19.014240 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 22 19:30:21.120810 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.120790 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:30:21.190885 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.190801 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dd59008-9be1-4900-9126-d96c1eec477b-isvc-sklearn-kube-rbac-proxy-sar-config\") pod \"3dd59008-9be1-4900-9126-d96c1eec477b\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " Apr 22 19:30:21.190885 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.190872 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r74l\" (UniqueName: \"kubernetes.io/projected/3dd59008-9be1-4900-9126-d96c1eec477b-kube-api-access-4r74l\") pod \"3dd59008-9be1-4900-9126-d96c1eec477b\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " Apr 22 19:30:21.191078 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.191040 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dd59008-9be1-4900-9126-d96c1eec477b-proxy-tls\") pod \"3dd59008-9be1-4900-9126-d96c1eec477b\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " Apr 22 19:30:21.191078 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.191074 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/3dd59008-9be1-4900-9126-d96c1eec477b-kserve-provision-location\") pod \"3dd59008-9be1-4900-9126-d96c1eec477b\" (UID: \"3dd59008-9be1-4900-9126-d96c1eec477b\") " Apr 22 19:30:21.191238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.191207 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd59008-9be1-4900-9126-d96c1eec477b-isvc-sklearn-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-kube-rbac-proxy-sar-config") pod "3dd59008-9be1-4900-9126-d96c1eec477b" (UID: "3dd59008-9be1-4900-9126-d96c1eec477b"). InnerVolumeSpecName "isvc-sklearn-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:30:21.191481 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.191458 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3dd59008-9be1-4900-9126-d96c1eec477b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3dd59008-9be1-4900-9126-d96c1eec477b" (UID: "3dd59008-9be1-4900-9126-d96c1eec477b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:30:21.192936 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.192915 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd59008-9be1-4900-9126-d96c1eec477b-kube-api-access-4r74l" (OuterVolumeSpecName: "kube-api-access-4r74l") pod "3dd59008-9be1-4900-9126-d96c1eec477b" (UID: "3dd59008-9be1-4900-9126-d96c1eec477b"). InnerVolumeSpecName "kube-api-access-4r74l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:30:21.193027 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.193003 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3dd59008-9be1-4900-9126-d96c1eec477b-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "3dd59008-9be1-4900-9126-d96c1eec477b" (UID: "3dd59008-9be1-4900-9126-d96c1eec477b"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:30:21.224334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.224303 2579 generic.go:358] "Generic (PLEG): container finished" podID="3dd59008-9be1-4900-9126-d96c1eec477b" containerID="9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c" exitCode=0 Apr 22 19:30:21.224465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.224382 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" Apr 22 19:30:21.224465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.224387 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerDied","Data":"9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c"} Apr 22 19:30:21.224465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.224427 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp" event={"ID":"3dd59008-9be1-4900-9126-d96c1eec477b","Type":"ContainerDied","Data":"f8f02b503b21883c2d497a138946415b1517646aaebda4e2af312a6aa68c029a"} Apr 22 19:30:21.224465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.224443 2579 scope.go:117] "RemoveContainer" containerID="477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900" Apr 22 19:30:21.232289 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.232252 2579 scope.go:117] 
"RemoveContainer" containerID="9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c" Apr 22 19:30:21.239078 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.239059 2579 scope.go:117] "RemoveContainer" containerID="9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9" Apr 22 19:30:21.246576 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.246407 2579 scope.go:117] "RemoveContainer" containerID="477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900" Apr 22 19:30:21.246717 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:30:21.246689 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900\": container with ID starting with 477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900 not found: ID does not exist" containerID="477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900" Apr 22 19:30:21.246779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.246729 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900"} err="failed to get container status \"477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900\": rpc error: code = NotFound desc = could not find container \"477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900\": container with ID starting with 477ccf2eaa3ad921aae58e45d498d94f810311c6271a084093e26eb990e79900 not found: ID does not exist" Apr 22 19:30:21.246779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.246752 2579 scope.go:117] "RemoveContainer" containerID="9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c" Apr 22 19:30:21.247023 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:30:21.247006 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c\": container with ID starting with 9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c not found: ID does not exist" containerID="9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c" Apr 22 19:30:21.247088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.247028 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c"} err="failed to get container status \"9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c\": rpc error: code = NotFound desc = could not find container \"9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c\": container with ID starting with 9531f465ef86fe2ae1930c15e2f65f6588c43b261626e4c08dff5bf2cb030a4c not found: ID does not exist" Apr 22 19:30:21.247088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.247048 2579 scope.go:117] "RemoveContainer" containerID="9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9" Apr 22 19:30:21.247309 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:30:21.247285 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9\": container with ID starting with 9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9 not found: ID does not exist" containerID="9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9" Apr 22 19:30:21.247378 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.247319 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9"} err="failed to get container status \"9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9\": rpc error: code = NotFound desc = could not find container 
\"9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9\": container with ID starting with 9f6c5012ce2a1167fd7f9874c6ab2daed81692c5672b17d0bcd0e925c815c8e9 not found: ID does not exist" Apr 22 19:30:21.248238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.248222 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp"] Apr 22 19:30:21.251609 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.251590 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-d8dbfbbb9-vp7dp"] Apr 22 19:30:21.292404 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.292377 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/3dd59008-9be1-4900-9126-d96c1eec477b-isvc-sklearn-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:30:21.292404 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.292405 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4r74l\" (UniqueName: \"kubernetes.io/projected/3dd59008-9be1-4900-9126-d96c1eec477b-kube-api-access-4r74l\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:30:21.292557 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.292418 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/3dd59008-9be1-4900-9126-d96c1eec477b-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:30:21.292557 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:21.292427 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3dd59008-9be1-4900-9126-d96c1eec477b-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:30:22.228153 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:22.228060 2579 
generic.go:358] "Generic (PLEG): container finished" podID="73300898-93ad-4277-8c6b-812df4cf43e5" containerID="6c93f14af76629d1f9dd586bee637ae86f201836beb906f87b237410baa92785" exitCode=0 Apr 22 19:30:22.228564 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:22.228145 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerDied","Data":"6c93f14af76629d1f9dd586bee637ae86f201836beb906f87b237410baa92785"} Apr 22 19:30:22.542824 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:22.542744 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" path="/var/lib/kubelet/pods/3dd59008-9be1-4900-9126-d96c1eec477b/volumes" Apr 22 19:30:23.234080 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:23.234046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerStarted","Data":"4068380ebc17dfe083c47c1935c9b42c64ac3345458b2308bfbc9bdc6698a98d"} Apr 22 19:30:23.234465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:23.234089 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerStarted","Data":"9f4ab4d1b6d38d42875eda2d9c31dc68cf2a3b6a95d2f2f647a7ab3e59d75834"} Apr 22 19:30:23.234465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:23.234304 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:23.256003 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:23.255959 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" 
podStartSLOduration=7.255946889 podStartE2EDuration="7.255946889s" podCreationTimestamp="2026-04-22 19:30:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:30:23.254702396 +0000 UTC m=+2635.254276304" watchObservedRunningTime="2026-04-22 19:30:23.255946889 +0000 UTC m=+2635.255520785" Apr 22 19:30:24.236704 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:24.236673 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:30:30.245357 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:30:30.245322 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:31:00.334813 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:00.334769 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 22 19:31:10.248653 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:10.248620 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" Apr 22 19:31:17.043548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.043463 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"] Apr 22 19:31:17.044003 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.043781 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kserve-container" 
containerID="cri-o://9f4ab4d1b6d38d42875eda2d9c31dc68cf2a3b6a95d2f2f647a7ab3e59d75834" gracePeriod=30 Apr 22 19:31:17.044003 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.043825 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kube-rbac-proxy" containerID="cri-o://4068380ebc17dfe083c47c1935c9b42c64ac3345458b2308bfbc9bdc6698a98d" gracePeriod=30 Apr 22 19:31:17.092145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092115 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"] Apr 22 19:31:17.092409 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092396 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kube-rbac-proxy" Apr 22 19:31:17.092465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092411 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kube-rbac-proxy" Apr 22 19:31:17.092465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092427 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="storage-initializer" Apr 22 19:31:17.092465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092432 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="storage-initializer" Apr 22 19:31:17.092465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092445 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" Apr 22 19:31:17.092465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092451 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" Apr 22 19:31:17.092625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092499 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kserve-container" Apr 22 19:31:17.092625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.092507 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="3dd59008-9be1-4900-9126-d96c1eec477b" containerName="kube-rbac-proxy" Apr 22 19:31:17.095596 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.095577 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" Apr 22 19:31:17.110337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.110311 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:31:17.110594 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.110579 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-runtime-predictor-serving-cert\"" Apr 22 19:31:17.129629 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.129605 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"] Apr 22 19:31:17.242508 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.242469 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" Apr 22 19:31:17.242683 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.242516 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.242683 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.242610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpw76\" (UniqueName: \"kubernetes.io/projected/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kube-api-access-kpw76\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.242683 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.242663 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.343886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.343794 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.343886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.343852 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.344120 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.343895 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.344120 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.343939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpw76\" (UniqueName: \"kubernetes.io/projected/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kube-api-access-kpw76\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.344320 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.344296 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.344545 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.344525 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.346374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.346353 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-proxy-tls\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.359825 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.359800 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpw76\" (UniqueName: \"kubernetes.io/projected/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kube-api-access-kpw76\") pod \"isvc-sklearn-runtime-predictor-65cd49579f-tkgq9\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.389811 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.389786 2579 generic.go:358] "Generic (PLEG): container finished" podID="73300898-93ad-4277-8c6b-812df4cf43e5" containerID="4068380ebc17dfe083c47c1935c9b42c64ac3345458b2308bfbc9bdc6698a98d" exitCode=2
Apr 22 19:31:17.389936 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.389866 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerDied","Data":"4068380ebc17dfe083c47c1935c9b42c64ac3345458b2308bfbc9bdc6698a98d"}
Apr 22 19:31:17.405011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.404991 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:17.539311 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:17.539115 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"]
Apr 22 19:31:17.541843 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:31:17.541813 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode578d619_1224_41f0_bb6c_fb7e2d4f04bb.slice/crio-15f96f22fbe3ebc8a1fd3dd6d2d6649d3af06698a1edea6294f713750eb7b91e WatchSource:0}: Error finding container 15f96f22fbe3ebc8a1fd3dd6d2d6649d3af06698a1edea6294f713750eb7b91e: Status 404 returned error can't find the container with id 15f96f22fbe3ebc8a1fd3dd6d2d6649d3af06698a1edea6294f713750eb7b91e
Apr 22 19:31:18.393983 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:18.393944 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerStarted","Data":"b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908"}
Apr 22 19:31:18.393983 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:18.393988 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerStarted","Data":"15f96f22fbe3ebc8a1fd3dd6d2d6649d3af06698a1edea6294f713750eb7b91e"}
Apr 22 19:31:20.240891 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:20.240842 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.47:8643/healthz\": dial tcp 10.132.0.47:8643: connect: connection refused"
Apr 22 19:31:23.408426 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:23.408392 2579 generic.go:358] "Generic (PLEG): container finished" podID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerID="b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908" exitCode=0
Apr 22 19:31:23.408784 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:23.408473 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerDied","Data":"b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908"}
Apr 22 19:31:24.413834 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.413794 2579 generic.go:358] "Generic (PLEG): container finished" podID="73300898-93ad-4277-8c6b-812df4cf43e5" containerID="9f4ab4d1b6d38d42875eda2d9c31dc68cf2a3b6a95d2f2f647a7ab3e59d75834" exitCode=0
Apr 22 19:31:24.414249 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.413879 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerDied","Data":"9f4ab4d1b6d38d42875eda2d9c31dc68cf2a3b6a95d2f2f647a7ab3e59d75834"}
Apr 22 19:31:24.415840 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.415813 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerStarted","Data":"316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff"}
Apr 22 19:31:24.415960 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.415852 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerStarted","Data":"466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c"}
Apr 22 19:31:24.416072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.416052 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:24.436887 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.436826 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" podStartSLOduration=7.436806588 podStartE2EDuration="7.436806588s" podCreationTimestamp="2026-04-22 19:31:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:31:24.436161725 +0000 UTC m=+2696.435735636" watchObservedRunningTime="2026-04-22 19:31:24.436806588 +0000 UTC m=+2696.436380488"
Apr 22 19:31:24.493953 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.493926 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"
Apr 22 19:31:24.598363 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.598252 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/73300898-93ad-4277-8c6b-812df4cf43e5-kube-api-access-jl8tl\") pod \"73300898-93ad-4277-8c6b-812df4cf43e5\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") "
Apr 22 19:31:24.598363 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.598328 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73300898-93ad-4277-8c6b-812df4cf43e5-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"73300898-93ad-4277-8c6b-812df4cf43e5\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") "
Apr 22 19:31:24.598619 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.598369 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73300898-93ad-4277-8c6b-812df4cf43e5-kserve-provision-location\") pod \"73300898-93ad-4277-8c6b-812df4cf43e5\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") "
Apr 22 19:31:24.598619 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.598395 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls\") pod \"73300898-93ad-4277-8c6b-812df4cf43e5\" (UID: \"73300898-93ad-4277-8c6b-812df4cf43e5\") "
Apr 22 19:31:24.598730 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.598688 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73300898-93ad-4277-8c6b-812df4cf43e5-sklearn-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "sklearn-v2-mlserver-kube-rbac-proxy-sar-config") pod "73300898-93ad-4277-8c6b-812df4cf43e5" (UID: "73300898-93ad-4277-8c6b-812df4cf43e5"). InnerVolumeSpecName "sklearn-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:31:24.599128 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.599097 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/73300898-93ad-4277-8c6b-812df4cf43e5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "73300898-93ad-4277-8c6b-812df4cf43e5" (UID: "73300898-93ad-4277-8c6b-812df4cf43e5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:31:24.600982 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.600959 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73300898-93ad-4277-8c6b-812df4cf43e5-kube-api-access-jl8tl" (OuterVolumeSpecName: "kube-api-access-jl8tl") pod "73300898-93ad-4277-8c6b-812df4cf43e5" (UID: "73300898-93ad-4277-8c6b-812df4cf43e5"). InnerVolumeSpecName "kube-api-access-jl8tl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:31:24.601929 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.601910 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "73300898-93ad-4277-8c6b-812df4cf43e5" (UID: "73300898-93ad-4277-8c6b-812df4cf43e5"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:31:24.699174 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.699138 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jl8tl\" (UniqueName: \"kubernetes.io/projected/73300898-93ad-4277-8c6b-812df4cf43e5-kube-api-access-jl8tl\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:24.699174 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.699167 2579 reconciler_common.go:299] "Volume detached for volume \"sklearn-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/73300898-93ad-4277-8c6b-812df4cf43e5-sklearn-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:24.699174 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.699177 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/73300898-93ad-4277-8c6b-812df4cf43e5-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:24.699441 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:24.699188 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73300898-93ad-4277-8c6b-812df4cf43e5-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:25.420532 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.420487 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx" event={"ID":"73300898-93ad-4277-8c6b-812df4cf43e5","Type":"ContainerDied","Data":"b10d5e2861fb92fe09383b712f31e624945c17967acc1ee70d0f4a2bd2e3e54e"}
Apr 22 19:31:25.421033 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.420548 2579 scope.go:117] "RemoveContainer" containerID="4068380ebc17dfe083c47c1935c9b42c64ac3345458b2308bfbc9bdc6698a98d"
Apr 22 19:31:25.421033 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.420500 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"
Apr 22 19:31:25.421033 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.420652 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:25.422437 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.422407 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 22 19:31:25.429677 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.429250 2579 scope.go:117] "RemoveContainer" containerID="9f4ab4d1b6d38d42875eda2d9c31dc68cf2a3b6a95d2f2f647a7ab3e59d75834"
Apr 22 19:31:25.437760 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.437729 2579 scope.go:117] "RemoveContainer" containerID="6c93f14af76629d1f9dd586bee637ae86f201836beb906f87b237410baa92785"
Apr 22 19:31:25.446655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.446620 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"]
Apr 22 19:31:25.450164 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:25.450135 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-65d8664766-t7jfx"]
Apr 22 19:31:26.430304 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:26.430241 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 22 19:31:26.543139 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:26.543105 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" path="/var/lib/kubelet/pods/73300898-93ad-4277-8c6b-812df4cf43e5/volumes"
Apr 22 19:31:31.435037 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:31.435009 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:31.435478 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:31.435450 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 22 19:31:41.435904 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:41.435873 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:54.000528 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.000499 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-65cd49579f-tkgq9_e578d619-1224-41f0-bb6c-fb7e2d4f04bb/kserve-container/0.log"
Apr 22 19:31:54.246025 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.245991 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"]
Apr 22 19:31:54.246307 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246295 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kube-rbac-proxy"
Apr 22 19:31:54.246364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246308 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kube-rbac-proxy"
Apr 22 19:31:54.246364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246323 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kserve-container"
Apr 22 19:31:54.246364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246329 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kserve-container"
Apr 22 19:31:54.246364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246339 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="storage-initializer"
Apr 22 19:31:54.246364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246345 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="storage-initializer"
Apr 22 19:31:54.246517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246393 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kserve-container"
Apr 22 19:31:54.246517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.246402 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="73300898-93ad-4277-8c6b-812df4cf43e5" containerName="kube-rbac-proxy"
Apr 22 19:31:54.249502 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.249487 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.252381 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.252325 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-predictor-serving-cert\""
Apr 22 19:31:54.252381 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.252330 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\""
Apr 22 19:31:54.272620 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.272593 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"]
Apr 22 19:31:54.313026 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.312996 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"]
Apr 22 19:31:54.313334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.313309 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kserve-container" containerID="cri-o://466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c" gracePeriod=30
Apr 22 19:31:54.313429 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.313341 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kube-rbac-proxy" containerID="cri-o://316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff" gracePeriod=30
Apr 22 19:31:54.329113 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.329092 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f46c78f-f15f-49b1-b197-0bee31570959-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.329203 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.329133 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f46c78f-f15f-49b1-b197-0bee31570959-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.329203 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.329154 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khpvg\" (UniqueName: \"kubernetes.io/projected/6f46c78f-f15f-49b1-b197-0bee31570959-kube-api-access-khpvg\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.329325 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.329224 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.430417 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.430389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f46c78f-f15f-49b1-b197-0bee31570959-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.430542 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.430429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khpvg\" (UniqueName: \"kubernetes.io/projected/6f46c78f-f15f-49b1-b197-0bee31570959-kube-api-access-khpvg\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.430542 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.430471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.430542 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.430509 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f46c78f-f15f-49b1-b197-0bee31570959-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.430715 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:31:54.430628 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-serving-cert: secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found
Apr 22 19:31:54.430715 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:31:54.430714 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls podName:6f46c78f-f15f-49b1-b197-0bee31570959 nodeName:}" failed. No retries permitted until 2026-04-22 19:31:54.930692552 +0000 UTC m=+2726.930266431 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls") pod "isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" (UID: "6f46c78f-f15f-49b1-b197-0bee31570959") : secret "isvc-sklearn-v2-runtime-predictor-serving-cert" not found
Apr 22 19:31:54.430951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.430930 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f46c78f-f15f-49b1-b197-0bee31570959-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.431168 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.431146 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f46c78f-f15f-49b1-b197-0bee31570959-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.440393 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.440369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khpvg\" (UniqueName: \"kubernetes.io/projected/6f46c78f-f15f-49b1-b197-0bee31570959-kube-api-access-khpvg\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.515062 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.514965 2579 generic.go:358] "Generic (PLEG): container finished" podID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerID="316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff" exitCode=2
Apr 22 19:31:54.515062 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.515040 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerDied","Data":"316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff"}
Apr 22 19:31:54.935476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.935442 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:54.938106 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:54.938077 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls\") pod \"isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:55.138252 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.138229 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:55.159674 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.159642 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:31:55.237779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.237749 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kpw76\" (UniqueName: \"kubernetes.io/projected/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kube-api-access-kpw76\") pod \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") "
Apr 22 19:31:55.237987 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.237789 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") pod \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") "
Apr 22 19:31:55.237987 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.237922 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-proxy-tls\") pod \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") "
Apr 22 19:31:55.238152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.237989 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kserve-provision-location\") pod \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\" (UID: \"e578d619-1224-41f0-bb6c-fb7e2d4f04bb\") "
Apr 22 19:31:55.238232 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.238150 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-isvc-sklearn-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-runtime-kube-rbac-proxy-sar-config") pod "e578d619-1224-41f0-bb6c-fb7e2d4f04bb" (UID: "e578d619-1224-41f0-bb6c-fb7e2d4f04bb"). InnerVolumeSpecName "isvc-sklearn-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:31:55.238413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.238291 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-isvc-sklearn-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:55.242794 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.242770 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kube-api-access-kpw76" (OuterVolumeSpecName: "kube-api-access-kpw76") pod "e578d619-1224-41f0-bb6c-fb7e2d4f04bb" (UID: "e578d619-1224-41f0-bb6c-fb7e2d4f04bb"). InnerVolumeSpecName "kube-api-access-kpw76". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:31:55.242921 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.242785 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e578d619-1224-41f0-bb6c-fb7e2d4f04bb" (UID: "e578d619-1224-41f0-bb6c-fb7e2d4f04bb"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:31:55.264779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.264745 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e578d619-1224-41f0-bb6c-fb7e2d4f04bb" (UID: "e578d619-1224-41f0-bb6c-fb7e2d4f04bb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:31:55.289890 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.289858 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"]
Apr 22 19:31:55.293456 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:31:55.293432 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f46c78f_f15f_49b1_b197_0bee31570959.slice/crio-a1f09f9a16d08acdf43232ce97f07383c0233b6b0c27f1b7bd135ef71b6d2ef6 WatchSource:0}: Error finding container a1f09f9a16d08acdf43232ce97f07383c0233b6b0c27f1b7bd135ef71b6d2ef6: Status 404 returned error can't find the container with id a1f09f9a16d08acdf43232ce97f07383c0233b6b0c27f1b7bd135ef71b6d2ef6
Apr 22 19:31:55.339418 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.339393 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:55.339418 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.339419 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:55.339552 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.339430 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kpw76\" (UniqueName: \"kubernetes.io/projected/e578d619-1224-41f0-bb6c-fb7e2d4f04bb-kube-api-access-kpw76\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:31:55.519623 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.519524 2579 generic.go:358] "Generic (PLEG): container finished" podID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerID="466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c" exitCode=0
Apr 22 19:31:55.519623 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.519596 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerDied","Data":"466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c"}
Apr 22 19:31:55.519623 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.519616 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"
Apr 22 19:31:55.519892 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.519636 2579 scope.go:117] "RemoveContainer" containerID="316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff"
Apr 22 19:31:55.519892 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.519626 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9" event={"ID":"e578d619-1224-41f0-bb6c-fb7e2d4f04bb","Type":"ContainerDied","Data":"15f96f22fbe3ebc8a1fd3dd6d2d6649d3af06698a1edea6294f713750eb7b91e"}
Apr 22 19:31:55.521045 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.521017 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerStarted","Data":"05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27"}
Apr 22 19:31:55.521174 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.521051 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerStarted","Data":"a1f09f9a16d08acdf43232ce97f07383c0233b6b0c27f1b7bd135ef71b6d2ef6"}
Apr 22 19:31:55.527887 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.527770 2579 scope.go:117] "RemoveContainer" containerID="466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c"
Apr 22 19:31:55.534966 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.534948 2579 scope.go:117] "RemoveContainer" containerID="b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908"
Apr 22 19:31:55.541724 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.541705 2579 scope.go:117] "RemoveContainer" containerID="316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff"
Apr 22 19:31:55.542001
ip-10-0-137-19 kubenswrapper[2579]: E0422 19:31:55.541982 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff\": container with ID starting with 316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff not found: ID does not exist" containerID="316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff" Apr 22 19:31:55.542081 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.542015 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff"} err="failed to get container status \"316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff\": rpc error: code = NotFound desc = could not find container \"316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff\": container with ID starting with 316c03825f36bba80c6a4deb8e53f9678032b57a8bcbb3e969ca2e96b9dcd6ff not found: ID does not exist" Apr 22 19:31:55.542081 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.542040 2579 scope.go:117] "RemoveContainer" containerID="466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c" Apr 22 19:31:55.542299 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:31:55.542280 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c\": container with ID starting with 466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c not found: ID does not exist" containerID="466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c" Apr 22 19:31:55.542339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.542307 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c"} 
err="failed to get container status \"466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c\": rpc error: code = NotFound desc = could not find container \"466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c\": container with ID starting with 466090f534c813def2cacbd0f44f1fbdcddbfe16b318f9cc375e152155cf682c not found: ID does not exist" Apr 22 19:31:55.542339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.542321 2579 scope.go:117] "RemoveContainer" containerID="b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908" Apr 22 19:31:55.542520 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:31:55.542495 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908\": container with ID starting with b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908 not found: ID does not exist" containerID="b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908" Apr 22 19:31:55.542581 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.542530 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908"} err="failed to get container status \"b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908\": rpc error: code = NotFound desc = could not find container \"b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908\": container with ID starting with b33efeeca6c1d11156bf6fd46b52f3e2f906b5c57a940bbb5f1f59c8617eb908 not found: ID does not exist" Apr 22 19:31:55.558423 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.558399 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"] Apr 22 19:31:55.563196 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:55.563170 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-65cd49579f-tkgq9"] Apr 22 19:31:56.543277 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:56.543233 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" path="/var/lib/kubelet/pods/e578d619-1224-41f0-bb6c-fb7e2d4f04bb/volumes" Apr 22 19:31:59.533094 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:59.533059 2579 generic.go:358] "Generic (PLEG): container finished" podID="6f46c78f-f15f-49b1-b197-0bee31570959" containerID="05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27" exitCode=0 Apr 22 19:31:59.533491 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:31:59.533123 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerDied","Data":"05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27"} Apr 22 19:32:00.541772 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:00.541740 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerStarted","Data":"0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78"} Apr 22 19:32:00.541772 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:00.541774 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerStarted","Data":"f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be"} Apr 22 19:32:00.542233 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:00.541962 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" Apr 22 19:32:00.542233 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:32:00.542017 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" Apr 22 19:32:00.567728 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:00.567670 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" podStartSLOduration=6.567654199 podStartE2EDuration="6.567654199s" podCreationTimestamp="2026-04-22 19:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:32:00.56472981 +0000 UTC m=+2732.564303710" watchObservedRunningTime="2026-04-22 19:32:00.567654199 +0000 UTC m=+2732.567228098" Apr 22 19:32:06.546545 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:06.546516 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" Apr 22 19:32:36.634530 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:36.634483 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 22 19:32:46.549166 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:46.549095 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" Apr 22 19:32:54.323434 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.323402 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"] Apr 22 19:32:54.323942 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.323721 2579 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kserve-container" containerID="cri-o://f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be" gracePeriod=30 Apr 22 19:32:54.323942 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.323756 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kube-rbac-proxy" containerID="cri-o://0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78" gracePeriod=30 Apr 22 19:32:54.394494 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394464 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"] Apr 22 19:32:54.394733 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394721 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kube-rbac-proxy" Apr 22 19:32:54.394791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394735 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kube-rbac-proxy" Apr 22 19:32:54.394791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394745 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kserve-container" Apr 22 19:32:54.394791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394751 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kserve-container" Apr 22 19:32:54.394791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394768 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" 
containerName="storage-initializer" Apr 22 19:32:54.394791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394774 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="storage-initializer" Apr 22 19:32:54.394970 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394819 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kserve-container" Apr 22 19:32:54.394970 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.394827 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e578d619-1224-41f0-bb6c-fb7e2d4f04bb" containerName="kube-rbac-proxy" Apr 22 19:32:54.397755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.397738 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:32:54.400810 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.400791 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-kube-rbac-proxy-sar-config\"" Apr 22 19:32:54.400950 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.400932 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-predictor-serving-cert\"" Apr 22 19:32:54.408524 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.408497 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"] Apr 22 19:32:54.479248 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.479220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2724aba3-32bd-4710-bdb0-5cd6844e1033-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:32:54.479419 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.479271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2724aba3-32bd-4710-bdb0-5cd6844e1033-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:32:54.479419 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.479356 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2724aba3-32bd-4710-bdb0-5cd6844e1033-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:32:54.479497 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.479442 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pmrr\" (UniqueName: \"kubernetes.io/projected/2724aba3-32bd-4710-bdb0-5cd6844e1033-kube-api-access-9pmrr\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:32:54.580886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.580791 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pmrr\" (UniqueName: \"kubernetes.io/projected/2724aba3-32bd-4710-bdb0-5cd6844e1033-kube-api-access-9pmrr\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" 
Apr 22 19:32:54.580886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.580832 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2724aba3-32bd-4710-bdb0-5cd6844e1033-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.580886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.580857 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2724aba3-32bd-4710-bdb0-5cd6844e1033-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.580886 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.580880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2724aba3-32bd-4710-bdb0-5cd6844e1033-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.581448 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.581417 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2724aba3-32bd-4710-bdb0-5cd6844e1033-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.581643 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.581626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2724aba3-32bd-4710-bdb0-5cd6844e1033-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.583369 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.583349 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2724aba3-32bd-4710-bdb0-5cd6844e1033-proxy-tls\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.590028 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.590007 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pmrr\" (UniqueName: \"kubernetes.io/projected/2724aba3-32bd-4710-bdb0-5cd6844e1033-kube-api-access-9pmrr\") pod \"isvc-sklearn-v2-predictor-69755fbb9-mvnht\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.683655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.683622 2579 generic.go:358] "Generic (PLEG): container finished" podID="6f46c78f-f15f-49b1-b197-0bee31570959" containerID="0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78" exitCode=2
Apr 22 19:32:54.683816 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.683698 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerDied","Data":"0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78"}
Apr 22 19:32:54.707352 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.707330 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:54.829684 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:54.829661 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"]
Apr 22 19:32:54.831488 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:32:54.831427 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2724aba3_32bd_4710_bdb0_5cd6844e1033.slice/crio-c045c8689e54ffb706b19eab53b4423e69a856acf26db8c1be5fb741a68cb3bf WatchSource:0}: Error finding container c045c8689e54ffb706b19eab53b4423e69a856acf26db8c1be5fb741a68cb3bf: Status 404 returned error can't find the container with id c045c8689e54ffb706b19eab53b4423e69a856acf26db8c1be5fb741a68cb3bf
Apr 22 19:32:55.687522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:55.687482 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerStarted","Data":"5a397a1d2c1812a0c745ef5ba2b88689e516178be48cae99a85e48c9e5a1fa8f"}
Apr 22 19:32:55.687522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:55.687526 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerStarted","Data":"c045c8689e54ffb706b19eab53b4423e69a856acf26db8c1be5fb741a68cb3bf"}
Apr 22 19:32:56.542074 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:56.542030 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.49:8643/healthz\": dial tcp 10.132.0.49:8643: connect: connection refused"
Apr 22 19:32:57.589454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:57.589405 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.49:8080/v2/models/isvc-sklearn-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Apr 22 19:32:58.696597 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:58.696567 2579 generic.go:358] "Generic (PLEG): container finished" podID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerID="5a397a1d2c1812a0c745ef5ba2b88689e516178be48cae99a85e48c9e5a1fa8f" exitCode=0
Apr 22 19:32:58.696974 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:58.696624 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerDied","Data":"5a397a1d2c1812a0c745ef5ba2b88689e516178be48cae99a85e48c9e5a1fa8f"}
Apr 22 19:32:59.701155 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:59.701116 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerStarted","Data":"c38379bd2967f9d4d64761f1c806607216a12e9e973b6ddd8695f4f21be4ee37"}
Apr 22 19:32:59.701155 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:59.701161 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerStarted","Data":"1f5a5d59eb0134073a8d436ecd25c4f89d5324b2979275607f34fb15847f669e"}
Apr 22 19:32:59.701599 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:59.701465 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:59.701635 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:59.701598 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"
Apr 22 19:32:59.702793 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:59.702771 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 22 19:32:59.723864 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:32:59.723815 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podStartSLOduration=5.723802185 podStartE2EDuration="5.723802185s" podCreationTimestamp="2026-04-22 19:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:32:59.721920547 +0000 UTC m=+2791.721494445" watchObservedRunningTime="2026-04-22 19:32:59.723802185 +0000 UTC m=+2791.723376082"
Apr 22 19:33:00.703895 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:00.703851 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused"
Apr 22 19:33:01.564560 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.564537 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:33:01.641536 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.641456 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f46c78f-f15f-49b1-b197-0bee31570959-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") pod \"6f46c78f-f15f-49b1-b197-0bee31570959\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") "
Apr 22 19:33:01.641536 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.641492 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls\") pod \"6f46c78f-f15f-49b1-b197-0bee31570959\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") "
Apr 22 19:33:01.641536 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.641533 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f46c78f-f15f-49b1-b197-0bee31570959-kserve-provision-location\") pod \"6f46c78f-f15f-49b1-b197-0bee31570959\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") "
Apr 22 19:33:01.641824 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.641581 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khpvg\" (UniqueName: \"kubernetes.io/projected/6f46c78f-f15f-49b1-b197-0bee31570959-kube-api-access-khpvg\") pod \"6f46c78f-f15f-49b1-b197-0bee31570959\" (UID: \"6f46c78f-f15f-49b1-b197-0bee31570959\") "
Apr 22 19:33:01.641959 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.641917 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f46c78f-f15f-49b1-b197-0bee31570959-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config") pod "6f46c78f-f15f-49b1-b197-0bee31570959" (UID: "6f46c78f-f15f-49b1-b197-0bee31570959"). InnerVolumeSpecName "isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:33:01.642073 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.641961 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f46c78f-f15f-49b1-b197-0bee31570959-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f46c78f-f15f-49b1-b197-0bee31570959" (UID: "6f46c78f-f15f-49b1-b197-0bee31570959"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:33:01.643655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.643632 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6f46c78f-f15f-49b1-b197-0bee31570959" (UID: "6f46c78f-f15f-49b1-b197-0bee31570959"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:33:01.643738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.643678 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f46c78f-f15f-49b1-b197-0bee31570959-kube-api-access-khpvg" (OuterVolumeSpecName: "kube-api-access-khpvg") pod "6f46c78f-f15f-49b1-b197-0bee31570959" (UID: "6f46c78f-f15f-49b1-b197-0bee31570959"). InnerVolumeSpecName "kube-api-access-khpvg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:33:01.708392 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.708356 2579 generic.go:358] "Generic (PLEG): container finished" podID="6f46c78f-f15f-49b1-b197-0bee31570959" containerID="f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be" exitCode=0
Apr 22 19:33:01.708779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.708438 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerDied","Data":"f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be"}
Apr 22 19:33:01.708779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.708458 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"
Apr 22 19:33:01.708779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.708470 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" event={"ID":"6f46c78f-f15f-49b1-b197-0bee31570959","Type":"ContainerDied","Data":"a1f09f9a16d08acdf43232ce97f07383c0233b6b0c27f1b7bd135ef71b6d2ef6"}
Apr 22 19:33:01.708779 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.708485 2579 scope.go:117] "RemoveContainer" containerID="0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78"
Apr 22 19:33:01.716990 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.716849 2579 scope.go:117] "RemoveContainer" containerID="f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be"
Apr 22 19:33:01.725735 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.725708 2579 scope.go:117] "RemoveContainer" containerID="05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27"
Apr 22 19:33:01.732601 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.732582 2579 scope.go:117] "RemoveContainer" containerID="0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78"
Apr 22 19:33:01.732846 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:33:01.732828 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78\": container with ID starting with 0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78 not found: ID does not exist" containerID="0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78"
Apr 22 19:33:01.732904 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.732855 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78"} err="failed to get container status \"0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78\": rpc error: code = NotFound desc = could not find container \"0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78\": container with ID starting with 0d1ce7beb6906138998b3c932fa0f415ed5cb410664062bdfbdb24bd229a1a78 not found: ID does not exist"
Apr 22 19:33:01.732904 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.732876 2579 scope.go:117] "RemoveContainer" containerID="f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be"
Apr 22 19:33:01.733126 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:33:01.733106 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be\": container with ID starting with f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be not found: ID does not exist" containerID="f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be"
Apr 22 19:33:01.733194 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.733155 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be"} err="failed to get container status \"f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be\": rpc error: code = NotFound desc = could not find container \"f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be\": container with ID starting with f2548956ef28c6171af578811c15875af12c516cb9e984ca9f64e5a9cd5c19be not found: ID does not exist"
Apr 22 19:33:01.733194 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.733180 2579 scope.go:117] "RemoveContainer" containerID="05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27"
Apr 22 19:33:01.733491 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:33:01.733440 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27\": container with ID starting with 05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27 not found: ID does not exist" containerID="05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27"
Apr 22 19:33:01.733637 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.733532 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27"} err="failed to get container status \"05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27\": rpc error: code = NotFound desc = could not find container \"05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27\": container with ID starting with 05590cbbc40a8a6da0badeed02ade422c580bec7f072b33a7ecb69845485ab27 not found: ID does not exist"
Apr 22 19:33:01.736075 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.736051 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"]
Apr 22 19:33:01.740004
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.739982 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr"] Apr 22 19:33:01.743017 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.742992 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6f46c78f-f15f-49b1-b197-0bee31570959-isvc-sklearn-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:33:01.743092 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.743020 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6f46c78f-f15f-49b1-b197-0bee31570959-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:33:01.743092 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.743032 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f46c78f-f15f-49b1-b197-0bee31570959-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:33:01.743092 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:01.743040 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-khpvg\" (UniqueName: \"kubernetes.io/projected/6f46c78f-f15f-49b1-b197-0bee31570959-kube-api-access-khpvg\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:33:02.542913 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:02.542882 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" path="/var/lib/kubelet/pods/6f46c78f-f15f-49b1-b197-0bee31570959/volumes" Apr 22 19:33:02.543237 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:02.543215 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-6d84c876f4-qt2hr" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.49:8643/healthz\": context deadline exceeded" Apr 22 19:33:05.707656 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:05.707627 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:33:05.708250 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:05.708224 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 22 19:33:15.708651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:15.708608 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 22 19:33:25.708889 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:25.708846 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 22 19:33:35.708287 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:35.708217 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 22 19:33:45.708214 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:33:45.708172 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 22 19:33:55.708197 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:33:55.708154 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 22 19:34:05.709446 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:05.709416 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:34:14.613236 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.613203 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"] Apr 22 19:34:14.613638 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.613542 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" containerID="cri-o://1f5a5d59eb0134073a8d436ecd25c4f89d5324b2979275607f34fb15847f669e" gracePeriod=30 Apr 22 19:34:14.613638 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.613581 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kube-rbac-proxy" containerID="cri-o://c38379bd2967f9d4d64761f1c806607216a12e9e973b6ddd8695f4f21be4ee37" gracePeriod=30 Apr 22 19:34:14.700350 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:34:14.700316 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp"] Apr 22 19:34:14.700594 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700582 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kube-rbac-proxy" Apr 22 19:34:14.700642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700596 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kube-rbac-proxy" Apr 22 19:34:14.700642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700607 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kserve-container" Apr 22 19:34:14.700642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700612 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kserve-container" Apr 22 19:34:14.700642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700625 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="storage-initializer" Apr 22 19:34:14.700642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700631 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="storage-initializer" Apr 22 19:34:14.700794 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700675 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kserve-container" Apr 22 19:34:14.700794 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.700684 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f46c78f-f15f-49b1-b197-0bee31570959" containerName="kube-rbac-proxy" Apr 22 19:34:14.703662 ip-10-0-137-19 kubenswrapper[2579]: 
I0422 19:34:14.703641 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.708296 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.708272 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-predictor-serving-cert\"" Apr 22 19:34:14.708703 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.708675 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\"" Apr 22 19:34:14.720158 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.720129 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp"] Apr 22 19:34:14.790234 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.790199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlg6j\" (UniqueName: \"kubernetes.io/projected/b5b29375-55e1-426c-8820-796cc1c86927-kube-api-access-tlg6j\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.790234 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.790237 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b29375-55e1-426c-8820-796cc1c86927-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.790457 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.790340 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b29375-55e1-426c-8820-796cc1c86927-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.790457 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.790383 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b29375-55e1-426c-8820-796cc1c86927-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.890985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.890904 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tlg6j\" (UniqueName: \"kubernetes.io/projected/b5b29375-55e1-426c-8820-796cc1c86927-kube-api-access-tlg6j\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.890985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.890939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b29375-55e1-426c-8820-796cc1c86927-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.890985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.890969 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b29375-55e1-426c-8820-796cc1c86927-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.891238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.890994 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b29375-55e1-426c-8820-796cc1c86927-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.891473 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.891441 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b29375-55e1-426c-8820-796cc1c86927-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.891671 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.891652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b29375-55e1-426c-8820-796cc1c86927-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.893395 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.893375 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/b5b29375-55e1-426c-8820-796cc1c86927-proxy-tls\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.902517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.902498 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlg6j\" (UniqueName: \"kubernetes.io/projected/b5b29375-55e1-426c-8820-796cc1c86927-kube-api-access-tlg6j\") pod \"isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:14.910469 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.910445 2579 generic.go:358] "Generic (PLEG): container finished" podID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerID="c38379bd2967f9d4d64761f1c806607216a12e9e973b6ddd8695f4f21be4ee37" exitCode=2 Apr 22 19:34:14.910548 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:14.910518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerDied","Data":"c38379bd2967f9d4d64761f1c806607216a12e9e973b6ddd8695f4f21be4ee37"} Apr 22 19:34:15.013495 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:15.013457 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:15.134722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:15.134634 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp"] Apr 22 19:34:15.137453 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:34:15.137427 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5b29375_55e1_426c_8820_796cc1c86927.slice/crio-6bf3aa43693bb16a7d4baac041ba69767de36724cecc1a8dc5b269f86723bc01 WatchSource:0}: Error finding container 6bf3aa43693bb16a7d4baac041ba69767de36724cecc1a8dc5b269f86723bc01: Status 404 returned error can't find the container with id 6bf3aa43693bb16a7d4baac041ba69767de36724cecc1a8dc5b269f86723bc01 Apr 22 19:34:15.139547 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:15.139533 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:34:15.704594 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:15.704556 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.50:8643/healthz\": dial tcp 10.132.0.50:8643: connect: connection refused" Apr 22 19:34:15.708882 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:15.708857 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.50:8080: connect: connection refused" Apr 22 19:34:15.916474 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:15.916439 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerStarted","Data":"5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2"} Apr 22 19:34:15.916474 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:15.916473 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerStarted","Data":"6bf3aa43693bb16a7d4baac041ba69767de36724cecc1a8dc5b269f86723bc01"} Apr 22 19:34:18.925490 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:18.925458 2579 generic.go:358] "Generic (PLEG): container finished" podID="b5b29375-55e1-426c-8820-796cc1c86927" containerID="5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2" exitCode=0 Apr 22 19:34:18.925897 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:18.925535 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerDied","Data":"5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2"} Apr 22 19:34:18.927478 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:18.927457 2579 generic.go:358] "Generic (PLEG): container finished" podID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerID="1f5a5d59eb0134073a8d436ecd25c4f89d5324b2979275607f34fb15847f669e" exitCode=0 Apr 22 19:34:18.927585 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:18.927517 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerDied","Data":"1f5a5d59eb0134073a8d436ecd25c4f89d5324b2979275607f34fb15847f669e"} Apr 22 19:34:18.976910 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:18.976886 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:34:19.018627 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.018602 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2724aba3-32bd-4710-bdb0-5cd6844e1033-proxy-tls\") pod \"2724aba3-32bd-4710-bdb0-5cd6844e1033\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " Apr 22 19:34:19.018744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.018659 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2724aba3-32bd-4710-bdb0-5cd6844e1033-kserve-provision-location\") pod \"2724aba3-32bd-4710-bdb0-5cd6844e1033\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " Apr 22 19:34:19.018744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.018687 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pmrr\" (UniqueName: \"kubernetes.io/projected/2724aba3-32bd-4710-bdb0-5cd6844e1033-kube-api-access-9pmrr\") pod \"2724aba3-32bd-4710-bdb0-5cd6844e1033\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " Apr 22 19:34:19.018744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.018737 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2724aba3-32bd-4710-bdb0-5cd6844e1033-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") pod \"2724aba3-32bd-4710-bdb0-5cd6844e1033\" (UID: \"2724aba3-32bd-4710-bdb0-5cd6844e1033\") " Apr 22 19:34:19.019050 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.019015 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2724aba3-32bd-4710-bdb0-5cd6844e1033-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2724aba3-32bd-4710-bdb0-5cd6844e1033" 
(UID: "2724aba3-32bd-4710-bdb0-5cd6844e1033"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:34:19.019166 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.019092 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2724aba3-32bd-4710-bdb0-5cd6844e1033-isvc-sklearn-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-kube-rbac-proxy-sar-config") pod "2724aba3-32bd-4710-bdb0-5cd6844e1033" (UID: "2724aba3-32bd-4710-bdb0-5cd6844e1033"). InnerVolumeSpecName "isvc-sklearn-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:34:19.020748 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.020723 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2724aba3-32bd-4710-bdb0-5cd6844e1033-kube-api-access-9pmrr" (OuterVolumeSpecName: "kube-api-access-9pmrr") pod "2724aba3-32bd-4710-bdb0-5cd6844e1033" (UID: "2724aba3-32bd-4710-bdb0-5cd6844e1033"). InnerVolumeSpecName "kube-api-access-9pmrr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:34:19.020889 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.020797 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2724aba3-32bd-4710-bdb0-5cd6844e1033-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "2724aba3-32bd-4710-bdb0-5cd6844e1033" (UID: "2724aba3-32bd-4710-bdb0-5cd6844e1033"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:34:19.120011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.119974 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/2724aba3-32bd-4710-bdb0-5cd6844e1033-isvc-sklearn-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:34:19.120011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.120006 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2724aba3-32bd-4710-bdb0-5cd6844e1033-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:34:19.120011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.120020 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2724aba3-32bd-4710-bdb0-5cd6844e1033-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:34:19.120295 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.120035 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9pmrr\" (UniqueName: \"kubernetes.io/projected/2724aba3-32bd-4710-bdb0-5cd6844e1033-kube-api-access-9pmrr\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:34:19.932283 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.932239 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerStarted","Data":"757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71"} Apr 22 19:34:19.932766 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.932308 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" 
event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerStarted","Data":"c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770"} Apr 22 19:34:19.932766 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.932537 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:19.934203 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.934175 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" event={"ID":"2724aba3-32bd-4710-bdb0-5cd6844e1033","Type":"ContainerDied","Data":"c045c8689e54ffb706b19eab53b4423e69a856acf26db8c1be5fb741a68cb3bf"} Apr 22 19:34:19.934361 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.934209 2579 scope.go:117] "RemoveContainer" containerID="c38379bd2967f9d4d64761f1c806607216a12e9e973b6ddd8695f4f21be4ee37" Apr 22 19:34:19.934361 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.934250 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht" Apr 22 19:34:19.941489 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.941466 2579 scope.go:117] "RemoveContainer" containerID="1f5a5d59eb0134073a8d436ecd25c4f89d5324b2979275607f34fb15847f669e" Apr 22 19:34:19.948654 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.948637 2579 scope.go:117] "RemoveContainer" containerID="5a397a1d2c1812a0c745ef5ba2b88689e516178be48cae99a85e48c9e5a1fa8f" Apr 22 19:34:19.954784 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.954743 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podStartSLOduration=5.954726728 podStartE2EDuration="5.954726728s" podCreationTimestamp="2026-04-22 19:34:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:34:19.953287137 +0000 UTC m=+2871.952861033" watchObservedRunningTime="2026-04-22 19:34:19.954726728 +0000 UTC m=+2871.954300626" Apr 22 19:34:19.967432 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.967409 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"] Apr 22 19:34:19.971535 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:19.971514 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-69755fbb9-mvnht"] Apr 22 19:34:20.543423 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:20.543393 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" path="/var/lib/kubelet/pods/2724aba3-32bd-4710-bdb0-5cd6844e1033/volumes" Apr 22 19:34:20.938205 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:20.938167 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:20.939320 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:20.939289 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:34:21.940827 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:21.940789 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:34:26.946681 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:26.946654 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:34:26.947198 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:26.947169 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:34:36.947721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:36.947680 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:34:46.947708 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:46.947667 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:34:56.948010 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:34:56.947967 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:35:06.948002 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:06.947962 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:35:16.947758 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:16.947716 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:35:26.947821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:26.947787 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:35:34.797437 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.797401 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp"] Apr 22 19:35:34.798072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.797789 2579 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" containerID="cri-o://c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770" gracePeriod=30 Apr 22 19:35:34.798072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.797855 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kube-rbac-proxy" containerID="cri-o://757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71" gracePeriod=30 Apr 22 19:35:34.920366 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920340 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68"] Apr 22 19:35:34.920614 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920603 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kube-rbac-proxy" Apr 22 19:35:34.920662 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920616 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kube-rbac-proxy" Apr 22 19:35:34.920662 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920625 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" Apr 22 19:35:34.920662 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920631 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" Apr 22 19:35:34.920662 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920638 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" 
containerName="storage-initializer" Apr 22 19:35:34.920662 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920644 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="storage-initializer" Apr 22 19:35:34.920818 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920701 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kserve-container" Apr 22 19:35:34.920818 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.920714 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="2724aba3-32bd-4710-bdb0-5cd6844e1033" containerName="kube-rbac-proxy" Apr 22 19:35:34.923517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.923501 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:34.927125 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.927103 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-kube-rbac-proxy-sar-config\"" Apr 22 19:35:34.927258 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.927135 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-predictor-serving-cert\"" Apr 22 19:35:34.944753 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.944726 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68"] Apr 22 19:35:34.984152 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.984121 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:34.984319 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.984161 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:34.984319 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.984191 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:34.984319 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:34.984213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkkvw\" (UniqueName: \"kubernetes.io/projected/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kube-api-access-hkkvw\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.084807 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.084729 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " 
pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.084807 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.084778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.084807 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.084797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkkvw\" (UniqueName: \"kubernetes.io/projected/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kube-api-access-hkkvw\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.085046 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.084843 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.085046 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:35:35.084886 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-predictor-serving-cert: secret "isvc-tensorflow-predictor-serving-cert" not found Apr 22 19:35:35.085046 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:35:35.084969 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls podName:c2a09dd1-9bd5-4e23-a1f2-f94606d0685d nodeName:}" failed. 
No retries permitted until 2026-04-22 19:35:35.584946508 +0000 UTC m=+2947.584520389 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls") pod "isvc-tensorflow-predictor-6756f669d7-5lk68" (UID: "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d") : secret "isvc-tensorflow-predictor-serving-cert" not found Apr 22 19:35:35.085315 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.085298 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kserve-provision-location\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.085473 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.085454 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.101225 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.101203 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkkvw\" (UniqueName: \"kubernetes.io/projected/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kube-api-access-hkkvw\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.134200 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.134175 2579 generic.go:358] "Generic (PLEG): container finished" podID="b5b29375-55e1-426c-8820-796cc1c86927" 
containerID="757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71" exitCode=2 Apr 22 19:35:35.134335 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.134240 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerDied","Data":"757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71"} Apr 22 19:35:35.589508 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.589471 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.591839 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.591820 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls\") pod \"isvc-tensorflow-predictor-6756f669d7-5lk68\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.835128 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.835092 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:35.967615 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:35.967559 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68"] Apr 22 19:35:35.971307 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:35:35.971256 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a09dd1_9bd5_4e23_a1f2_f94606d0685d.slice/crio-59c4b50c0b2a9c605910cdc318b99127094e113924581475462a848d1e4bec36 WatchSource:0}: Error finding container 59c4b50c0b2a9c605910cdc318b99127094e113924581475462a848d1e4bec36: Status 404 returned error can't find the container with id 59c4b50c0b2a9c605910cdc318b99127094e113924581475462a848d1e4bec36 Apr 22 19:35:36.138828 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:36.138739 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerStarted","Data":"60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1"} Apr 22 19:35:36.138828 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:36.138779 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerStarted","Data":"59c4b50c0b2a9c605910cdc318b99127094e113924581475462a848d1e4bec36"} Apr 22 19:35:36.941827 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:36.941790 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.51:8643/healthz\": dial tcp 10.132.0.51:8643: connect: connection refused" Apr 22 
19:35:36.947149 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:36.947124 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused" Apr 22 19:35:39.135724 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.135701 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:35:39.147324 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.147293 2579 generic.go:358] "Generic (PLEG): container finished" podID="b5b29375-55e1-426c-8820-796cc1c86927" containerID="c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770" exitCode=0 Apr 22 19:35:39.147451 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.147334 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerDied","Data":"c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770"} Apr 22 19:35:39.147451 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.147365 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" event={"ID":"b5b29375-55e1-426c-8820-796cc1c86927","Type":"ContainerDied","Data":"6bf3aa43693bb16a7d4baac041ba69767de36724cecc1a8dc5b269f86723bc01"} Apr 22 19:35:39.147451 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.147367 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp" Apr 22 19:35:39.147451 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.147380 2579 scope.go:117] "RemoveContainer" containerID="757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71" Apr 22 19:35:39.154418 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.154400 2579 scope.go:117] "RemoveContainer" containerID="c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770" Apr 22 19:35:39.162878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.162846 2579 scope.go:117] "RemoveContainer" containerID="5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2" Apr 22 19:35:39.169995 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.169976 2579 scope.go:117] "RemoveContainer" containerID="757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71" Apr 22 19:35:39.170242 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:35:39.170224 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71\": container with ID starting with 757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71 not found: ID does not exist" containerID="757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71" Apr 22 19:35:39.170306 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.170250 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71"} err="failed to get container status \"757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71\": rpc error: code = NotFound desc = could not find container \"757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71\": container with ID starting with 757564811ef581ffe40c2d123ab8d31f76120b61436ee4b14657ecf4a8841a71 not found: ID does not exist" Apr 22 
19:35:39.170306 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.170288 2579 scope.go:117] "RemoveContainer" containerID="c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770" Apr 22 19:35:39.170506 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:35:39.170484 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770\": container with ID starting with c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770 not found: ID does not exist" containerID="c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770" Apr 22 19:35:39.170579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.170515 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770"} err="failed to get container status \"c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770\": rpc error: code = NotFound desc = could not find container \"c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770\": container with ID starting with c9cc61f0d095a434c646450cffdd5d0ef7dacef49073cb90ed943f22c0a5d770 not found: ID does not exist" Apr 22 19:35:39.170579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.170537 2579 scope.go:117] "RemoveContainer" containerID="5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2" Apr 22 19:35:39.170778 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:35:39.170760 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2\": container with ID starting with 5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2 not found: ID does not exist" containerID="5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2" Apr 22 19:35:39.170818 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.170792 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2"} err="failed to get container status \"5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2\": rpc error: code = NotFound desc = could not find container \"5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2\": container with ID starting with 5efdeea11e863d8a26af3b926b8c45061050041cc2feb85ce36e134003fcfca2 not found: ID does not exist" Apr 22 19:35:39.211534 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.211450 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b29375-55e1-426c-8820-796cc1c86927-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") pod \"b5b29375-55e1-426c-8820-796cc1c86927\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " Apr 22 19:35:39.211534 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.211506 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b29375-55e1-426c-8820-796cc1c86927-proxy-tls\") pod \"b5b29375-55e1-426c-8820-796cc1c86927\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " Apr 22 19:35:39.211534 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.211533 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlg6j\" (UniqueName: \"kubernetes.io/projected/b5b29375-55e1-426c-8820-796cc1c86927-kube-api-access-tlg6j\") pod \"b5b29375-55e1-426c-8820-796cc1c86927\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " Apr 22 19:35:39.211880 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.211576 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b5b29375-55e1-426c-8820-796cc1c86927-kserve-provision-location\") pod \"b5b29375-55e1-426c-8820-796cc1c86927\" (UID: \"b5b29375-55e1-426c-8820-796cc1c86927\") " Apr 22 19:35:39.211950 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.211921 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5b29375-55e1-426c-8820-796cc1c86927-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config") pod "b5b29375-55e1-426c-8820-796cc1c86927" (UID: "b5b29375-55e1-426c-8820-796cc1c86927"). InnerVolumeSpecName "isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:35:39.211950 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.211938 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5b29375-55e1-426c-8820-796cc1c86927-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5b29375-55e1-426c-8820-796cc1c86927" (UID: "b5b29375-55e1-426c-8820-796cc1c86927"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:35:39.213620 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.213598 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5b29375-55e1-426c-8820-796cc1c86927-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b5b29375-55e1-426c-8820-796cc1c86927" (UID: "b5b29375-55e1-426c-8820-796cc1c86927"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:35:39.213724 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.213696 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5b29375-55e1-426c-8820-796cc1c86927-kube-api-access-tlg6j" (OuterVolumeSpecName: "kube-api-access-tlg6j") pod "b5b29375-55e1-426c-8820-796cc1c86927" (UID: "b5b29375-55e1-426c-8820-796cc1c86927"). InnerVolumeSpecName "kube-api-access-tlg6j". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:35:39.312058 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.312008 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b5b29375-55e1-426c-8820-796cc1c86927-isvc-sklearn-v2-mixed-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:35:39.312058 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.312050 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b5b29375-55e1-426c-8820-796cc1c86927-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:35:39.312058 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.312063 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tlg6j\" (UniqueName: \"kubernetes.io/projected/b5b29375-55e1-426c-8820-796cc1c86927-kube-api-access-tlg6j\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:35:39.312058 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.312076 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5b29375-55e1-426c-8820-796cc1c86927-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:35:39.473032 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.472962 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp"] Apr 22 19:35:39.478612 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:39.478590 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7f8b779bc6-m7nbp"] Apr 22 19:35:40.542812 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:40.542776 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5b29375-55e1-426c-8820-796cc1c86927" path="/var/lib/kubelet/pods/b5b29375-55e1-426c-8820-796cc1c86927/volumes" Apr 22 19:35:41.155005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:41.154974 2579 generic.go:358] "Generic (PLEG): container finished" podID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerID="60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1" exitCode=0 Apr 22 19:35:41.155196 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:41.155051 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerDied","Data":"60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1"} Apr 22 19:35:46.173737 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:46.173653 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerStarted","Data":"a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f"} Apr 22 19:35:46.173737 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:46.173695 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerStarted","Data":"48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4"} Apr 22 19:35:46.174139 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:46.173899 2579 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:46.196741 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:46.196694 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podStartSLOduration=8.064143137 podStartE2EDuration="12.196679235s" podCreationTimestamp="2026-04-22 19:35:34 +0000 UTC" firstStartedPulling="2026-04-22 19:35:41.156126783 +0000 UTC m=+2953.155700657" lastFinishedPulling="2026-04-22 19:35:45.28866287 +0000 UTC m=+2957.288236755" observedRunningTime="2026-04-22 19:35:46.195070503 +0000 UTC m=+2958.194644400" watchObservedRunningTime="2026-04-22 19:35:46.196679235 +0000 UTC m=+2958.196253134" Apr 22 19:35:47.177094 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:47.177060 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:35:47.178347 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:47.178320 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 22 19:35:48.180144 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:48.180100 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 22 19:35:53.184782 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:53.184750 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 
19:35:53.185317 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:35:53.185292 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 22 19:36:03.186106 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:03.186066 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:36:15.669420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.669383 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68"] Apr 22 19:36:15.669983 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.669711 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kserve-container" containerID="cri-o://48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4" gracePeriod=30 Apr 22 19:36:15.669983 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.669738 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" containerID="cri-o://a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f" gracePeriod=30 Apr 22 19:36:15.765770 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.765730 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7"] Apr 22 19:36:15.766016 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766003 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" Apr 22 19:36:15.766067 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766018 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" Apr 22 19:36:15.766067 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766028 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="storage-initializer" Apr 22 19:36:15.766067 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766034 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="storage-initializer" Apr 22 19:36:15.766067 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766040 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kube-rbac-proxy" Apr 22 19:36:15.766067 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766046 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kube-rbac-proxy" Apr 22 19:36:15.766234 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766090 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kserve-container" Apr 22 19:36:15.766234 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.766099 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5b29375-55e1-426c-8820-796cc1c86927" containerName="kube-rbac-proxy" Apr 22 19:36:15.768945 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.768926 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.771839 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.771815 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-predictor-serving-cert\"" Apr 22 19:36:15.771946 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.771819 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:36:15.779106 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.779082 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7"] Apr 22 19:36:15.883471 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.883438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.883664 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.883503 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg6cg\" (UniqueName: \"kubernetes.io/projected/91a40b16-6ebd-4e53-8e8d-b65042222aec-kube-api-access-gg6cg\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.883664 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.883570 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/91a40b16-6ebd-4e53-8e8d-b65042222aec-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.883664 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.883616 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a40b16-6ebd-4e53-8e8d-b65042222aec-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.984281 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.984229 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91a40b16-6ebd-4e53-8e8d-b65042222aec-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.984465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.984300 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a40b16-6ebd-4e53-8e8d-b65042222aec-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.984465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.984373 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.984465 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.984424 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg6cg\" (UniqueName: \"kubernetes.io/projected/91a40b16-6ebd-4e53-8e8d-b65042222aec-kube-api-access-gg6cg\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.984613 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:36:15.984503 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-serving-cert: secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 22 19:36:15.984613 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:36:15.984572 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls podName:91a40b16-6ebd-4e53-8e8d-b65042222aec nodeName:}" failed. No retries permitted until 2026-04-22 19:36:16.484557034 +0000 UTC m=+2988.484130910 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls") pod "isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" (UID: "91a40b16-6ebd-4e53-8e8d-b65042222aec") : secret "isvc-tensorflow-runtime-predictor-serving-cert" not found Apr 22 19:36:15.984740 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.984724 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a40b16-6ebd-4e53-8e8d-b65042222aec-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.984961 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.984945 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91a40b16-6ebd-4e53-8e8d-b65042222aec-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:15.993795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:15.993767 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg6cg\" (UniqueName: \"kubernetes.io/projected/91a40b16-6ebd-4e53-8e8d-b65042222aec-kube-api-access-gg6cg\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:16.259595 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:16.259508 2579 generic.go:358] "Generic (PLEG): container finished" podID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" 
containerID="a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f" exitCode=2 Apr 22 19:36:16.259595 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:16.259566 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerDied","Data":"a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f"} Apr 22 19:36:16.488304 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:16.488246 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:16.490666 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:16.490631 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls\") pod \"isvc-tensorflow-runtime-predictor-8699d78cf-c84s7\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:16.679136 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:16.679036 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:16.801582 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:16.801386 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7"] Apr 22 19:36:16.804454 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:36:16.804426 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91a40b16_6ebd_4e53_8e8d_b65042222aec.slice/crio-099442fb78a9c1f900873ce6c39e1dd3c9c9a966383b035b2623a1cd849fe794 WatchSource:0}: Error finding container 099442fb78a9c1f900873ce6c39e1dd3c9c9a966383b035b2623a1cd849fe794: Status 404 returned error can't find the container with id 099442fb78a9c1f900873ce6c39e1dd3c9c9a966383b035b2623a1cd849fe794 Apr 22 19:36:17.264101 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:17.264062 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerStarted","Data":"41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d"} Apr 22 19:36:17.264101 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:17.264099 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerStarted","Data":"099442fb78a9c1f900873ce6c39e1dd3c9c9a966383b035b2623a1cd849fe794"} Apr 22 19:36:18.181058 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:18.181018 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection 
refused" Apr 22 19:36:21.275653 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:21.275624 2579 generic.go:358] "Generic (PLEG): container finished" podID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerID="41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d" exitCode=0 Apr 22 19:36:21.275986 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:21.275692 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerDied","Data":"41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d"} Apr 22 19:36:22.281316 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:22.281283 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerStarted","Data":"0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f"} Apr 22 19:36:22.281704 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:22.281325 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerStarted","Data":"5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe"} Apr 22 19:36:22.281704 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:22.281528 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:22.300870 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:22.300826 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podStartSLOduration=7.30081103 podStartE2EDuration="7.30081103s" podCreationTimestamp="2026-04-22 19:36:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:36:22.300305047 +0000 UTC m=+2994.299878943" watchObservedRunningTime="2026-04-22 19:36:22.30081103 +0000 UTC m=+2994.300384904" Apr 22 19:36:23.180537 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:23.180478 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 22 19:36:23.286854 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:23.286818 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:23.287993 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:23.287963 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 22 19:36:24.288981 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:24.288943 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 22 19:36:28.181187 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:28.181146 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 
10.132.0.52:8643: connect: connection refused" Apr 22 19:36:28.181655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:28.181255 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:36:29.293697 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:29.293667 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:29.294215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:29.294188 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 22 19:36:33.181312 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:33.181251 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 22 19:36:38.180693 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:38.180652 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 22 19:36:39.295097 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:39.295066 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:36:43.180998 ip-10-0-137-19 kubenswrapper[2579]: 
I0422 19:36:43.180956 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.52:8643/healthz\": dial tcp 10.132.0.52:8643: connect: connection refused" Apr 22 19:36:45.695965 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:36:45.695939 2579 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2a09dd1_9bd5_4e23_a1f2_f94606d0685d.slice/crio-59c4b50c0b2a9c605910cdc318b99127094e113924581475462a848d1e4bec36\": RecentStats: unable to find data in memory cache]" Apr 22 19:36:46.312863 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.312838 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:36:46.354201 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.354161 2579 generic.go:358] "Generic (PLEG): container finished" podID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerID="48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4" exitCode=137 Apr 22 19:36:46.354201 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.354201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerDied","Data":"48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4"} Apr 22 19:36:46.354425 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.354227 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" event={"ID":"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d","Type":"ContainerDied","Data":"59c4b50c0b2a9c605910cdc318b99127094e113924581475462a848d1e4bec36"} Apr 22 
19:36:46.354425 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.354237 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68" Apr 22 19:36:46.354425 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.354247 2579 scope.go:117] "RemoveContainer" containerID="a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f" Apr 22 19:36:46.362149 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.362132 2579 scope.go:117] "RemoveContainer" containerID="48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4" Apr 22 19:36:46.369322 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.369303 2579 scope.go:117] "RemoveContainer" containerID="60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1" Apr 22 19:36:46.375956 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.375933 2579 scope.go:117] "RemoveContainer" containerID="a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f" Apr 22 19:36:46.376205 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:36:46.376187 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f\": container with ID starting with a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f not found: ID does not exist" containerID="a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f" Apr 22 19:36:46.376364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.376213 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f"} err="failed to get container status \"a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f\": rpc error: code = NotFound desc = could not find container \"a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f\": container with ID 
starting with a749660439a461532c4b987c6a7d75ddfd5912d08ab318e23f1809dbb8de814f not found: ID does not exist" Apr 22 19:36:46.376364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.376232 2579 scope.go:117] "RemoveContainer" containerID="48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4" Apr 22 19:36:46.376483 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:36:46.376464 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4\": container with ID starting with 48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4 not found: ID does not exist" containerID="48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4" Apr 22 19:36:46.376522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.376491 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4"} err="failed to get container status \"48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4\": rpc error: code = NotFound desc = could not find container \"48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4\": container with ID starting with 48681d75e91540d35fdfff301655668464e4bc44e5d84ff7ff4c7fb7109bc6b4 not found: ID does not exist" Apr 22 19:36:46.376522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.376512 2579 scope.go:117] "RemoveContainer" containerID="60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1" Apr 22 19:36:46.376765 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:36:46.376746 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1\": container with ID starting with 60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1 not found: ID does not 
exist" containerID="60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1" Apr 22 19:36:46.376821 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.376773 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1"} err="failed to get container status \"60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1\": rpc error: code = NotFound desc = could not find container \"60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1\": container with ID starting with 60721e0a2e2e57ea0f4b40e806ac23ae433e61cc948458e4614c8b0b6939c6f1 not found: ID does not exist" Apr 22 19:36:46.424715 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.424684 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls\") pod \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " Apr 22 19:36:46.424715 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.424718 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkkvw\" (UniqueName: \"kubernetes.io/projected/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kube-api-access-hkkvw\") pod \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " Apr 22 19:36:46.424926 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.424760 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kserve-provision-location\") pod \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " Apr 22 19:36:46.424926 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.424788 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-isvc-tensorflow-kube-rbac-proxy-sar-config\") pod \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\" (UID: \"c2a09dd1-9bd5-4e23-a1f2-f94606d0685d\") " Apr 22 19:36:46.425115 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.425090 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-isvc-tensorflow-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-kube-rbac-proxy-sar-config") pod "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" (UID: "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d"). InnerVolumeSpecName "isvc-tensorflow-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:36:46.426788 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.426760 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" (UID: "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:36:46.426884 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.426836 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kube-api-access-hkkvw" (OuterVolumeSpecName: "kube-api-access-hkkvw") pod "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" (UID: "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d"). InnerVolumeSpecName "kube-api-access-hkkvw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:36:46.441117 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.441039 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" (UID: "c2a09dd1-9bd5-4e23-a1f2-f94606d0685d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:36:46.525387 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.525350 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:36:46.525387 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.525382 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hkkvw\" (UniqueName: \"kubernetes.io/projected/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kube-api-access-hkkvw\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:36:46.525387 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.525393 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:36:46.525609 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.525426 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d-isvc-tensorflow-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:36:46.672225 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.672193 2579 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68"] Apr 22 19:36:46.680241 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:46.680215 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-6756f669d7-5lk68"] Apr 22 19:36:48.546592 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:48.546561 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" path="/var/lib/kubelet/pods/c2a09dd1-9bd5-4e23-a1f2-f94606d0685d/volumes" Apr 22 19:36:56.574896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.574864 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7"] Apr 22 19:36:56.575455 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.575340 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kserve-container" containerID="cri-o://5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe" gracePeriod=30 Apr 22 19:36:56.575455 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.575405 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" containerID="cri-o://0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f" gracePeriod=30 Apr 22 19:36:56.655114 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655080 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc"] Apr 22 19:36:56.655463 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655448 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" 
containerName="storage-initializer" Apr 22 19:36:56.655528 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655466 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="storage-initializer" Apr 22 19:36:56.655528 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655474 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kserve-container" Apr 22 19:36:56.655528 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655480 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kserve-container" Apr 22 19:36:56.655528 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655495 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" Apr 22 19:36:56.655528 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655501 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" Apr 22 19:36:56.655686 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655541 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kserve-container" Apr 22 19:36:56.655686 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.655551 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2a09dd1-9bd5-4e23-a1f2-f94606d0685d" containerName="kube-rbac-proxy" Apr 22 19:36:56.660253 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.660231 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.663136 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.663106 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-kube-rbac-proxy-sar-config\"" Apr 22 19:36:56.663258 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.663142 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-triton-predictor-serving-cert\"" Apr 22 19:36:56.668518 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.668493 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc"] Apr 22 19:36:56.697496 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.697471 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13234161-3313-4366-bb0e-c76f3c747ba1-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.697616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.697511 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13234161-3313-4366-bb0e-c76f3c747ba1-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.697616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.697578 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw8h6\" (UniqueName: \"kubernetes.io/projected/13234161-3313-4366-bb0e-c76f3c747ba1-kube-api-access-vw8h6\") pod 
\"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.697616 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.697608 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13234161-3313-4366-bb0e-c76f3c747ba1-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.798100 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.798060 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw8h6\" (UniqueName: \"kubernetes.io/projected/13234161-3313-4366-bb0e-c76f3c747ba1-kube-api-access-vw8h6\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.798340 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.798115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13234161-3313-4366-bb0e-c76f3c747ba1-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.798340 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.798166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13234161-3313-4366-bb0e-c76f3c747ba1-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " 
pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.798340 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.798201 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13234161-3313-4366-bb0e-c76f3c747ba1-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.798547 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.798519 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13234161-3313-4366-bb0e-c76f3c747ba1-kserve-provision-location\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.798784 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.798764 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13234161-3313-4366-bb0e-c76f3c747ba1-isvc-triton-kube-rbac-proxy-sar-config\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.800568 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.800547 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13234161-3313-4366-bb0e-c76f3c747ba1-proxy-tls\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.807530 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.807505 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-vw8h6\" (UniqueName: \"kubernetes.io/projected/13234161-3313-4366-bb0e-c76f3c747ba1-kube-api-access-vw8h6\") pod \"isvc-triton-predictor-84bb65d94b-d9xvc\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:56.973154 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:56.973121 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:36:57.092681 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:57.092625 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc"] Apr 22 19:36:57.094834 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:36:57.094802 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13234161_3313_4366_bb0e_c76f3c747ba1.slice/crio-b8b703e68e4e5c2f11ae808acd3f2a221fbabefef42c1b8da678148a9492bbaa WatchSource:0}: Error finding container b8b703e68e4e5c2f11ae808acd3f2a221fbabefef42c1b8da678148a9492bbaa: Status 404 returned error can't find the container with id b8b703e68e4e5c2f11ae808acd3f2a221fbabefef42c1b8da678148a9492bbaa Apr 22 19:36:57.389561 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:57.389465 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerStarted","Data":"28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92"} Apr 22 19:36:57.389561 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:57.389515 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerStarted","Data":"b8b703e68e4e5c2f11ae808acd3f2a221fbabefef42c1b8da678148a9492bbaa"} Apr 22 19:36:57.391312 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:57.391282 2579 generic.go:358] "Generic (PLEG): container finished" podID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerID="0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f" exitCode=2 Apr 22 19:36:57.391312 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:57.391292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerDied","Data":"0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f"} Apr 22 19:36:59.289162 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:36:59.289123 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 22 19:37:01.405599 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:01.405561 2579 generic.go:358] "Generic (PLEG): container finished" podID="13234161-3313-4366-bb0e-c76f3c747ba1" containerID="28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92" exitCode=0 Apr 22 19:37:01.405989 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:01.405628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerDied","Data":"28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92"} Apr 22 19:37:04.290426 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:04.290010 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 22 19:37:09.290042 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:09.289986 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 22 19:37:09.290538 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:09.290124 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:37:14.290111 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:14.290041 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 22 19:37:19.290091 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:19.290043 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection refused" Apr 22 19:37:24.289940 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:24.289889 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.53:8643/healthz\": dial tcp 10.132.0.53:8643: connect: connection 
refused" Apr 22 19:37:27.267368 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.267340 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:37:27.354544 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.354418 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls\") pod \"91a40b16-6ebd-4e53-8e8d-b65042222aec\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " Apr 22 19:37:27.354544 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.354522 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91a40b16-6ebd-4e53-8e8d-b65042222aec-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") pod \"91a40b16-6ebd-4e53-8e8d-b65042222aec\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " Apr 22 19:37:27.354791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.354563 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg6cg\" (UniqueName: \"kubernetes.io/projected/91a40b16-6ebd-4e53-8e8d-b65042222aec-kube-api-access-gg6cg\") pod \"91a40b16-6ebd-4e53-8e8d-b65042222aec\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " Apr 22 19:37:27.354791 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.354643 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a40b16-6ebd-4e53-8e8d-b65042222aec-kserve-provision-location\") pod \"91a40b16-6ebd-4e53-8e8d-b65042222aec\" (UID: \"91a40b16-6ebd-4e53-8e8d-b65042222aec\") " Apr 22 19:37:27.355197 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.355163 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/91a40b16-6ebd-4e53-8e8d-b65042222aec-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config") pod "91a40b16-6ebd-4e53-8e8d-b65042222aec" (UID: "91a40b16-6ebd-4e53-8e8d-b65042222aec"). InnerVolumeSpecName "isvc-tensorflow-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:37:27.359044 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.359013 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91a40b16-6ebd-4e53-8e8d-b65042222aec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "91a40b16-6ebd-4e53-8e8d-b65042222aec" (UID: "91a40b16-6ebd-4e53-8e8d-b65042222aec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:37:27.365463 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.365401 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "91a40b16-6ebd-4e53-8e8d-b65042222aec" (UID: "91a40b16-6ebd-4e53-8e8d-b65042222aec"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:37:27.365574 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.365492 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/91a40b16-6ebd-4e53-8e8d-b65042222aec-kube-api-access-gg6cg" (OuterVolumeSpecName: "kube-api-access-gg6cg") pod "91a40b16-6ebd-4e53-8e8d-b65042222aec" (UID: "91a40b16-6ebd-4e53-8e8d-b65042222aec"). InnerVolumeSpecName "kube-api-access-gg6cg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:37:27.455642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.455606 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91a40b16-6ebd-4e53-8e8d-b65042222aec-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:37:27.455642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.455641 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/91a40b16-6ebd-4e53-8e8d-b65042222aec-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:37:27.455879 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.455658 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/91a40b16-6ebd-4e53-8e8d-b65042222aec-isvc-tensorflow-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:37:27.455879 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.455704 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gg6cg\" (UniqueName: \"kubernetes.io/projected/91a40b16-6ebd-4e53-8e8d-b65042222aec-kube-api-access-gg6cg\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:37:27.527904 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.527860 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" Apr 22 19:37:27.528136 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.527768 2579 generic.go:358] "Generic (PLEG): container finished" podID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerID="5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe" exitCode=137 Apr 22 19:37:27.528238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.528135 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerDied","Data":"5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe"} Apr 22 19:37:27.528238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.528165 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7" event={"ID":"91a40b16-6ebd-4e53-8e8d-b65042222aec","Type":"ContainerDied","Data":"099442fb78a9c1f900873ce6c39e1dd3c9c9a966383b035b2623a1cd849fe794"} Apr 22 19:37:27.528238 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.528185 2579 scope.go:117] "RemoveContainer" containerID="0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f" Apr 22 19:37:27.541327 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.541304 2579 scope.go:117] "RemoveContainer" containerID="5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe" Apr 22 19:37:27.551741 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.551720 2579 scope.go:117] "RemoveContainer" containerID="41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d" Apr 22 19:37:27.557376 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.557355 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7"] Apr 22 19:37:27.560998 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.560953 2579 
kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-8699d78cf-c84s7"] Apr 22 19:37:27.561607 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.561572 2579 scope.go:117] "RemoveContainer" containerID="0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f" Apr 22 19:37:27.562007 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:37:27.561880 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f\": container with ID starting with 0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f not found: ID does not exist" containerID="0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f" Apr 22 19:37:27.562007 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.561917 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f"} err="failed to get container status \"0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f\": rpc error: code = NotFound desc = could not find container \"0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f\": container with ID starting with 0c1981687fd81eb09292bbc3dd22e821d18e01a7c6f0761ecf8e07966623a75f not found: ID does not exist" Apr 22 19:37:27.562007 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.561944 2579 scope.go:117] "RemoveContainer" containerID="5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe" Apr 22 19:37:27.562233 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:37:27.562197 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe\": container with ID starting with 5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe not found: 
ID does not exist" containerID="5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe" Apr 22 19:37:27.562334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.562241 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe"} err="failed to get container status \"5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe\": rpc error: code = NotFound desc = could not find container \"5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe\": container with ID starting with 5d675afc55721e4f22657a01df8d7b1b0f0291d5e0b565dfcb82c73a4d5abefe not found: ID does not exist" Apr 22 19:37:27.562334 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.562258 2579 scope.go:117] "RemoveContainer" containerID="41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d" Apr 22 19:37:27.562571 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:37:27.562544 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d\": container with ID starting with 41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d not found: ID does not exist" containerID="41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d" Apr 22 19:37:27.562699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:27.562580 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d"} err="failed to get container status \"41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d\": rpc error: code = NotFound desc = could not find container \"41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d\": container with ID starting with 41a3435942ee669fe8f5a799f3501e1795649e6ddd18ebe27e2cf568bed18b3d not found: ID does not exist" 
Apr 22 19:37:28.544710 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:37:28.544674 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" path="/var/lib/kubelet/pods/91a40b16-6ebd-4e53-8e8d-b65042222aec/volumes" Apr 22 19:38:54.794979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:38:54.794937 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerStarted","Data":"2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0"} Apr 22 19:38:54.794979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:38:54.794983 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerStarted","Data":"7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829"} Apr 22 19:38:54.795446 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:38:54.795120 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:38:54.819400 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:38:54.819354 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" podStartSLOduration=5.825847318 podStartE2EDuration="1m58.819341191s" podCreationTimestamp="2026-04-22 19:36:56 +0000 UTC" firstStartedPulling="2026-04-22 19:37:01.406693954 +0000 UTC m=+3033.406267831" lastFinishedPulling="2026-04-22 19:38:54.400187828 +0000 UTC m=+3146.399761704" observedRunningTime="2026-04-22 19:38:54.816219655 +0000 UTC m=+3146.815793554" watchObservedRunningTime="2026-04-22 19:38:54.819341191 +0000 UTC m=+3146.818915088" Apr 22 19:38:55.797918 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:38:55.797882 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:38:55.799076 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:38:55.799040 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 22 19:38:56.800552 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:38:56.800518 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 22 19:39:01.805817 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:01.805787 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:39:01.806610 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:01.806591 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:39:08.313128 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.313095 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc"] Apr 22 19:39:08.313633 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.313473 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kserve-container" containerID="cri-o://7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829" gracePeriod=30 Apr 22 19:39:08.313633 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.313589 2579 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kube-rbac-proxy" containerID="cri-o://2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0" gracePeriod=30 Apr 22 19:39:08.460395 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460362 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd"] Apr 22 19:39:08.460647 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460635 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="storage-initializer" Apr 22 19:39:08.460699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460649 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="storage-initializer" Apr 22 19:39:08.460699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460657 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kserve-container" Apr 22 19:39:08.460699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460663 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kserve-container" Apr 22 19:39:08.460699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460687 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" Apr 22 19:39:08.460699 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460693 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" Apr 22 19:39:08.460851 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460769 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kube-rbac-proxy" Apr 22 19:39:08.460851 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.460799 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="91a40b16-6ebd-4e53-8e8d-b65042222aec" containerName="kserve-container" Apr 22 19:39:08.474132 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.474097 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd"] Apr 22 19:39:08.474304 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.474291 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.477376 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.477350 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-predictor-serving-cert\"" Apr 22 19:39:08.477528 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.477355 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-kube-rbac-proxy-sar-config\"" Apr 22 19:39:08.606151 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.606057 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/146fb2f9-5fa0-4e53-94c2-b05c5908964f-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.606399 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.606377 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54bl7\" (UniqueName: \"kubernetes.io/projected/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kube-api-access-54bl7\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: 
\"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.606500 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.606423 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/146fb2f9-5fa0-4e53-94c2-b05c5908964f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.606574 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.606499 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.707215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.707166 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.707434 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.707242 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/146fb2f9-5fa0-4e53-94c2-b05c5908964f-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 
19:39:08.707434 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.707312 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-54bl7\" (UniqueName: \"kubernetes.io/projected/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kube-api-access-54bl7\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.707434 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.707348 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/146fb2f9-5fa0-4e53-94c2-b05c5908964f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.707702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.707679 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kserve-provision-location\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.707951 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.707935 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/146fb2f9-5fa0-4e53-94c2-b05c5908964f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.709729 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.709710 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/146fb2f9-5fa0-4e53-94c2-b05c5908964f-proxy-tls\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.717300 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.717275 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-54bl7\" (UniqueName: \"kubernetes.io/projected/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kube-api-access-54bl7\") pod \"isvc-xgboost-predictor-8689c4cfcc-5jghd\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.784280 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.784228 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:08.832339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.832309 2579 generic.go:358] "Generic (PLEG): container finished" podID="13234161-3313-4366-bb0e-c76f3c747ba1" containerID="2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0" exitCode=2 Apr 22 19:39:08.832522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.832352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerDied","Data":"2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0"} Apr 22 19:39:08.934123 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:08.934097 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd"] Apr 22 19:39:08.936638 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:39:08.936612 2579 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod146fb2f9_5fa0_4e53_94c2_b05c5908964f.slice/crio-b4aa0286abe960544cd2bf0ba6b8fc5bb4cb09bef9e996ac439953e4e7249371 WatchSource:0}: Error finding container b4aa0286abe960544cd2bf0ba6b8fc5bb4cb09bef9e996ac439953e4e7249371: Status 404 returned error can't find the container with id b4aa0286abe960544cd2bf0ba6b8fc5bb4cb09bef9e996ac439953e4e7249371 Apr 22 19:39:09.836956 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:09.836922 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerStarted","Data":"d3555b1a5283769b55539b2ddebe82d6c2d6d43cdfad6d4eb5bd6f7e4bf5ee1e"} Apr 22 19:39:09.836956 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:09.836958 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerStarted","Data":"b4aa0286abe960544cd2bf0ba6b8fc5bb4cb09bef9e996ac439953e4e7249371"} Apr 22 19:39:11.175442 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.175419 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:39:11.224953 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.224880 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13234161-3313-4366-bb0e-c76f3c747ba1-kserve-provision-location\") pod \"13234161-3313-4366-bb0e-c76f3c747ba1\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " Apr 22 19:39:11.224953 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.224921 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13234161-3313-4366-bb0e-c76f3c747ba1-proxy-tls\") pod \"13234161-3313-4366-bb0e-c76f3c747ba1\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " Apr 22 19:39:11.224953 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.224945 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13234161-3313-4366-bb0e-c76f3c747ba1-isvc-triton-kube-rbac-proxy-sar-config\") pod \"13234161-3313-4366-bb0e-c76f3c747ba1\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " Apr 22 19:39:11.225233 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.224976 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw8h6\" (UniqueName: \"kubernetes.io/projected/13234161-3313-4366-bb0e-c76f3c747ba1-kube-api-access-vw8h6\") pod \"13234161-3313-4366-bb0e-c76f3c747ba1\" (UID: \"13234161-3313-4366-bb0e-c76f3c747ba1\") " Apr 22 19:39:11.225374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.225349 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/13234161-3313-4366-bb0e-c76f3c747ba1-isvc-triton-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-triton-kube-rbac-proxy-sar-config") pod 
"13234161-3313-4366-bb0e-c76f3c747ba1" (UID: "13234161-3313-4366-bb0e-c76f3c747ba1"). InnerVolumeSpecName "isvc-triton-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:39:11.225432 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.225355 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13234161-3313-4366-bb0e-c76f3c747ba1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "13234161-3313-4366-bb0e-c76f3c747ba1" (UID: "13234161-3313-4366-bb0e-c76f3c747ba1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:39:11.226946 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.226924 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13234161-3313-4366-bb0e-c76f3c747ba1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "13234161-3313-4366-bb0e-c76f3c747ba1" (UID: "13234161-3313-4366-bb0e-c76f3c747ba1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:39:11.227054 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.227013 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13234161-3313-4366-bb0e-c76f3c747ba1-kube-api-access-vw8h6" (OuterVolumeSpecName: "kube-api-access-vw8h6") pod "13234161-3313-4366-bb0e-c76f3c747ba1" (UID: "13234161-3313-4366-bb0e-c76f3c747ba1"). InnerVolumeSpecName "kube-api-access-vw8h6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:39:11.325651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.325619 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-triton-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/13234161-3313-4366-bb0e-c76f3c747ba1-isvc-triton-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:39:11.325651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.325648 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vw8h6\" (UniqueName: \"kubernetes.io/projected/13234161-3313-4366-bb0e-c76f3c747ba1-kube-api-access-vw8h6\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:39:11.325651 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.325659 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13234161-3313-4366-bb0e-c76f3c747ba1-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:39:11.325891 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.325668 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/13234161-3313-4366-bb0e-c76f3c747ba1-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:39:11.845086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.845046 2579 generic.go:358] "Generic (PLEG): container finished" podID="13234161-3313-4366-bb0e-c76f3c747ba1" containerID="7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829" exitCode=0 Apr 22 19:39:11.845290 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.845100 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerDied","Data":"7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829"} Apr 22 
19:39:11.845290 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.845130 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" event={"ID":"13234161-3313-4366-bb0e-c76f3c747ba1","Type":"ContainerDied","Data":"b8b703e68e4e5c2f11ae808acd3f2a221fbabefef42c1b8da678148a9492bbaa"} Apr 22 19:39:11.845290 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.845129 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc" Apr 22 19:39:11.845290 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.845198 2579 scope.go:117] "RemoveContainer" containerID="2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0" Apr 22 19:39:11.853096 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.853077 2579 scope.go:117] "RemoveContainer" containerID="7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829" Apr 22 19:39:11.859743 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.859726 2579 scope.go:117] "RemoveContainer" containerID="28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92" Apr 22 19:39:11.866381 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.866312 2579 scope.go:117] "RemoveContainer" containerID="2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0" Apr 22 19:39:11.866549 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:39:11.866532 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0\": container with ID starting with 2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0 not found: ID does not exist" containerID="2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0" Apr 22 19:39:11.866599 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.866556 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0"} err="failed to get container status \"2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0\": rpc error: code = NotFound desc = could not find container \"2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0\": container with ID starting with 2017236c26179f56a844d94f427375127a2236e35c0868a350a27d90d4ae6fe0 not found: ID does not exist" Apr 22 19:39:11.866599 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.866571 2579 scope.go:117] "RemoveContainer" containerID="7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829" Apr 22 19:39:11.866763 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:39:11.866746 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829\": container with ID starting with 7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829 not found: ID does not exist" containerID="7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829" Apr 22 19:39:11.866807 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.866767 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829"} err="failed to get container status \"7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829\": rpc error: code = NotFound desc = could not find container \"7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829\": container with ID starting with 7942d7fca60d5a0c8d7ebccdb657ac8f0c8814701c5566f7622b20f9719cf829 not found: ID does not exist" Apr 22 19:39:11.866807 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.866779 2579 scope.go:117] "RemoveContainer" containerID="28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92" Apr 22 19:39:11.866991 ip-10-0-137-19 
kubenswrapper[2579]: E0422 19:39:11.866968 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92\": container with ID starting with 28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92 not found: ID does not exist" containerID="28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92" Apr 22 19:39:11.867045 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.867002 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92"} err="failed to get container status \"28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92\": rpc error: code = NotFound desc = could not find container \"28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92\": container with ID starting with 28078394449dfbe3ba59f3c94cc8ac02956d3b961793bd830c8dc2cdedffce92 not found: ID does not exist" Apr 22 19:39:11.867633 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.867614 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc"] Apr 22 19:39:11.872030 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:11.872012 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-84bb65d94b-d9xvc"] Apr 22 19:39:12.542870 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:12.542835 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" path="/var/lib/kubelet/pods/13234161-3313-4366-bb0e-c76f3c747ba1/volumes" Apr 22 19:39:13.855652 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:13.855616 2579 generic.go:358] "Generic (PLEG): container finished" podID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerID="d3555b1a5283769b55539b2ddebe82d6c2d6d43cdfad6d4eb5bd6f7e4bf5ee1e" exitCode=0 
Apr 22 19:39:13.855652 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:13.855652 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerDied","Data":"d3555b1a5283769b55539b2ddebe82d6c2d6d43cdfad6d4eb5bd6f7e4bf5ee1e"} Apr 22 19:39:32.237842 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:32.237821 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:39:32.914972 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:32.914943 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerStarted","Data":"cba5eaf8d30bd81701f447b590beef33120005a366f1eb06f46e42850e8dadcb"} Apr 22 19:39:32.915172 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:32.914978 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerStarted","Data":"8f0af972f51a4598163932c662038ecf33348dc8db84ad6187ca79c906a4e3cf"} Apr 22 19:39:32.915239 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:32.915196 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:32.935946 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:32.935905 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podStartSLOduration=6.684087177 podStartE2EDuration="24.935893613s" podCreationTimestamp="2026-04-22 19:39:08 +0000 UTC" firstStartedPulling="2026-04-22 19:39:13.856824983 +0000 UTC m=+3165.856398862" lastFinishedPulling="2026-04-22 19:39:32.108631412 +0000 UTC m=+3184.108205298" observedRunningTime="2026-04-22 
19:39:32.934084985 +0000 UTC m=+3184.933658909" watchObservedRunningTime="2026-04-22 19:39:32.935893613 +0000 UTC m=+3184.935467577" Apr 22 19:39:33.918149 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:33.918109 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:33.919278 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:33.919238 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:39:34.920989 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:34.920944 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:39:39.925064 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:39.925035 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:39:39.925698 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:39.925671 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:39:49.926566 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:49.926528 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial 
tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:39:59.925994 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:39:59.925958 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:40:09.926060 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:09.926012 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:40:19.926361 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:19.926321 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:40:29.925632 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:29.925592 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:40:39.926427 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:39.926392 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:40:48.574565 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.574526 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd"] Apr 22 19:40:48.574940 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:40:48.574868 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" containerID="cri-o://8f0af972f51a4598163932c662038ecf33348dc8db84ad6187ca79c906a4e3cf" gracePeriod=30 Apr 22 19:40:48.575007 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.574915 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kube-rbac-proxy" containerID="cri-o://cba5eaf8d30bd81701f447b590beef33120005a366f1eb06f46e42850e8dadcb" gracePeriod=30 Apr 22 19:40:48.663892 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.663859 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5"] Apr 22 19:40:48.664133 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664122 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="storage-initializer" Apr 22 19:40:48.664188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664135 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="storage-initializer" Apr 22 19:40:48.664188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664142 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kserve-container" Apr 22 19:40:48.664188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664149 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kserve-container" Apr 22 19:40:48.664188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664165 2579 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kube-rbac-proxy" Apr 22 19:40:48.664188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664172 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kube-rbac-proxy" Apr 22 19:40:48.664373 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664215 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kserve-container" Apr 22 19:40:48.664373 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.664224 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="13234161-3313-4366-bb0e-c76f3c747ba1" containerName="kube-rbac-proxy" Apr 22 19:40:48.667133 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.667113 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.670027 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.670009 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-predictor-serving-cert\"" Apr 22 19:40:48.670108 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.670076 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 22 19:40:48.677218 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.677196 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5"] Apr 22 19:40:48.784862 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.784827 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/314f65e5-754a-48bb-bc81-2204e42c55e7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.785019 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.784882 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/314f65e5-754a-48bb-bc81-2204e42c55e7-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.785019 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.784945 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.785019 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.784977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s7wg\" (UniqueName: \"kubernetes.io/projected/314f65e5-754a-48bb-bc81-2204e42c55e7-kube-api-access-2s7wg\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.885797 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.885714 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" 
(UniqueName: \"kubernetes.io/configmap/314f65e5-754a-48bb-bc81-2204e42c55e7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.885797 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.885763 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/314f65e5-754a-48bb-bc81-2204e42c55e7-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.886037 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.885878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.886037 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.885912 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2s7wg\" (UniqueName: \"kubernetes.io/projected/314f65e5-754a-48bb-bc81-2204e42c55e7-kube-api-access-2s7wg\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.886037 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:40:48.885995 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-serving-cert: secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found Apr 22 
19:40:48.886211 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:40:48.886067 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls podName:314f65e5-754a-48bb-bc81-2204e42c55e7 nodeName:}" failed. No retries permitted until 2026-04-22 19:40:49.386051956 +0000 UTC m=+3261.385625835 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls") pod "isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" (UID: "314f65e5-754a-48bb-bc81-2204e42c55e7") : secret "isvc-xgboost-v2-mlserver-predictor-serving-cert" not found Apr 22 19:40:48.886211 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.886138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/314f65e5-754a-48bb-bc81-2204e42c55e7-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.886497 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.886476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/314f65e5-754a-48bb-bc81-2204e42c55e7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:48.895246 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:48.895218 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2s7wg\" (UniqueName: \"kubernetes.io/projected/314f65e5-754a-48bb-bc81-2204e42c55e7-kube-api-access-2s7wg\") pod 
\"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:49.127161 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.127131 2579 generic.go:358] "Generic (PLEG): container finished" podID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerID="cba5eaf8d30bd81701f447b590beef33120005a366f1eb06f46e42850e8dadcb" exitCode=2 Apr 22 19:40:49.127339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.127201 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerDied","Data":"cba5eaf8d30bd81701f447b590beef33120005a366f1eb06f46e42850e8dadcb"} Apr 22 19:40:49.391426 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.391389 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:49.393872 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.393848 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls\") pod \"isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:49.577879 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.577845 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:49.700849 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.700822 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5"] Apr 22 19:40:49.703170 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:40:49.703141 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod314f65e5_754a_48bb_bc81_2204e42c55e7.slice/crio-def6b7d3a67c92fa562cc6b22dd49661cc77e87d6f9b02d982fc3fff90611f6d WatchSource:0}: Error finding container def6b7d3a67c92fa562cc6b22dd49661cc77e87d6f9b02d982fc3fff90611f6d: Status 404 returned error can't find the container with id def6b7d3a67c92fa562cc6b22dd49661cc77e87d6f9b02d982fc3fff90611f6d Apr 22 19:40:49.921393 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.921291 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.55:8643/healthz\": dial tcp 10.132.0.55:8643: connect: connection refused" Apr 22 19:40:49.925605 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:49.925571 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused" Apr 22 19:40:50.132141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:50.132103 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" 
event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerStarted","Data":"2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58"} Apr 22 19:40:50.132141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:50.132138 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerStarted","Data":"def6b7d3a67c92fa562cc6b22dd49661cc77e87d6f9b02d982fc3fff90611f6d"} Apr 22 19:40:52.142712 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.142680 2579 generic.go:358] "Generic (PLEG): container finished" podID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerID="8f0af972f51a4598163932c662038ecf33348dc8db84ad6187ca79c906a4e3cf" exitCode=0 Apr 22 19:40:52.143166 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.142747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerDied","Data":"8f0af972f51a4598163932c662038ecf33348dc8db84ad6187ca79c906a4e3cf"} Apr 22 19:40:52.201927 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.201904 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:40:52.314406 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.314317 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/146fb2f9-5fa0-4e53-94c2-b05c5908964f-isvc-xgboost-kube-rbac-proxy-sar-config\") pod \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " Apr 22 19:40:52.314406 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.314366 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/146fb2f9-5fa0-4e53-94c2-b05c5908964f-proxy-tls\") pod \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " Apr 22 19:40:52.314648 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.314429 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54bl7\" (UniqueName: \"kubernetes.io/projected/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kube-api-access-54bl7\") pod \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " Apr 22 19:40:52.314648 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.314448 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kserve-provision-location\") pod \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\" (UID: \"146fb2f9-5fa0-4e53-94c2-b05c5908964f\") " Apr 22 19:40:52.314766 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.314735 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "146fb2f9-5fa0-4e53-94c2-b05c5908964f" (UID: 
"146fb2f9-5fa0-4e53-94c2-b05c5908964f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:40:52.314766 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.314751 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/146fb2f9-5fa0-4e53-94c2-b05c5908964f-isvc-xgboost-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-kube-rbac-proxy-sar-config") pod "146fb2f9-5fa0-4e53-94c2-b05c5908964f" (UID: "146fb2f9-5fa0-4e53-94c2-b05c5908964f"). InnerVolumeSpecName "isvc-xgboost-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:40:52.316598 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.316571 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/146fb2f9-5fa0-4e53-94c2-b05c5908964f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "146fb2f9-5fa0-4e53-94c2-b05c5908964f" (UID: "146fb2f9-5fa0-4e53-94c2-b05c5908964f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:40:52.316697 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.316609 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kube-api-access-54bl7" (OuterVolumeSpecName: "kube-api-access-54bl7") pod "146fb2f9-5fa0-4e53-94c2-b05c5908964f" (UID: "146fb2f9-5fa0-4e53-94c2-b05c5908964f"). InnerVolumeSpecName "kube-api-access-54bl7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:40:52.415197 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.415165 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/146fb2f9-5fa0-4e53-94c2-b05c5908964f-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:40:52.415197 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.415192 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-54bl7\" (UniqueName: \"kubernetes.io/projected/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kube-api-access-54bl7\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:40:52.415197 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.415202 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/146fb2f9-5fa0-4e53-94c2-b05c5908964f-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:40:52.415455 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:52.415211 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/146fb2f9-5fa0-4e53-94c2-b05c5908964f-isvc-xgboost-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:40:53.146955 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:53.146917 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" event={"ID":"146fb2f9-5fa0-4e53-94c2-b05c5908964f","Type":"ContainerDied","Data":"b4aa0286abe960544cd2bf0ba6b8fc5bb4cb09bef9e996ac439953e4e7249371"} Apr 22 19:40:53.146955 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:53.146953 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd" Apr 22 19:40:53.147498 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:53.146969 2579 scope.go:117] "RemoveContainer" containerID="cba5eaf8d30bd81701f447b590beef33120005a366f1eb06f46e42850e8dadcb" Apr 22 19:40:53.154396 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:53.154377 2579 scope.go:117] "RemoveContainer" containerID="8f0af972f51a4598163932c662038ecf33348dc8db84ad6187ca79c906a4e3cf" Apr 22 19:40:53.161182 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:53.161164 2579 scope.go:117] "RemoveContainer" containerID="d3555b1a5283769b55539b2ddebe82d6c2d6d43cdfad6d4eb5bd6f7e4bf5ee1e" Apr 22 19:40:53.168843 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:53.168821 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd"] Apr 22 19:40:53.173136 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:53.173114 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-8689c4cfcc-5jghd"] Apr 22 19:40:54.151019 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:54.150984 2579 generic.go:358] "Generic (PLEG): container finished" podID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerID="2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58" exitCode=0 Apr 22 19:40:54.151524 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:54.151059 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerDied","Data":"2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58"} Apr 22 19:40:54.544448 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:54.544417 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" path="/var/lib/kubelet/pods/146fb2f9-5fa0-4e53-94c2-b05c5908964f/volumes" Apr 22 
19:40:55.156059 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:55.156021 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerStarted","Data":"e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506"} Apr 22 19:40:55.156059 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:55.156063 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerStarted","Data":"db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683"} Apr 22 19:40:55.156529 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:55.156429 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:55.156529 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:55.156463 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:40:55.179386 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:40:55.179343 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" podStartSLOduration=7.179328705 podStartE2EDuration="7.179328705s" podCreationTimestamp="2026-04-22 19:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:40:55.177418575 +0000 UTC m=+3267.176992473" watchObservedRunningTime="2026-04-22 19:40:55.179328705 +0000 UTC m=+3267.178902650" Apr 22 19:41:01.164094 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:01.164068 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:41:31.168040 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:31.168006 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:41:38.746460 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.746418 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5"] Apr 22 19:41:38.746947 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.746890 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kserve-container" containerID="cri-o://db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683" gracePeriod=30 Apr 22 19:41:38.747024 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.746941 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kube-rbac-proxy" containerID="cri-o://e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506" gracePeriod=30 Apr 22 19:41:38.832860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.832827 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr"] Apr 22 19:41:38.833095 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833084 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kube-rbac-proxy" Apr 22 19:41:38.833145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833097 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" 
containerName="kube-rbac-proxy" Apr 22 19:41:38.833145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833114 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" Apr 22 19:41:38.833145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833119 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" Apr 22 19:41:38.833145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833126 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="storage-initializer" Apr 22 19:41:38.833145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833132 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="storage-initializer" Apr 22 19:41:38.833341 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833201 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kserve-container" Apr 22 19:41:38.833341 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.833208 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="146fb2f9-5fa0-4e53-94c2-b05c5908964f" containerName="kube-rbac-proxy" Apr 22 19:41:38.837624 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.837601 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:38.840515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.840494 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-predictor-serving-cert\"" Apr 22 19:41:38.840615 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.840522 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\"" Apr 22 19:41:38.847492 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.847469 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr"] Apr 22 19:41:38.953828 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.953788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqpg\" (UniqueName: \"kubernetes.io/projected/5e138105-ec8b-499d-8f52-7d1799081fef-kube-api-access-4lqpg\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:38.953828 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.953830 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e138105-ec8b-499d-8f52-7d1799081fef-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:38.954052 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.953929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:38.954052 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:38.953965 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e138105-ec8b-499d-8f52-7d1799081fef-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.054735 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.054637 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqpg\" (UniqueName: \"kubernetes.io/projected/5e138105-ec8b-499d-8f52-7d1799081fef-kube-api-access-4lqpg\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.054735 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.054677 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e138105-ec8b-499d-8f52-7d1799081fef-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.054979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.054798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls\") pod 
\"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.054979 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.054839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e138105-ec8b-499d-8f52-7d1799081fef-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.054979 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:41:39.054936 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-serving-cert: secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 22 19:41:39.055134 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:41:39.055013 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls podName:5e138105-ec8b-499d-8f52-7d1799081fef nodeName:}" failed. No retries permitted until 2026-04-22 19:41:39.554995158 +0000 UTC m=+3311.554569036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls") pod "xgboost-v2-mlserver-predictor-7799869d6f-v4chr" (UID: "5e138105-ec8b-499d-8f52-7d1799081fef") : secret "xgboost-v2-mlserver-predictor-serving-cert" not found Apr 22 19:41:39.055188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.055176 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e138105-ec8b-499d-8f52-7d1799081fef-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.055444 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.055428 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e138105-ec8b-499d-8f52-7d1799081fef-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.064356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.064335 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqpg\" (UniqueName: \"kubernetes.io/projected/5e138105-ec8b-499d-8f52-7d1799081fef-kube-api-access-4lqpg\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.280911 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.280871 2579 generic.go:358] "Generic (PLEG): container finished" podID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerID="e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506" 
exitCode=2 Apr 22 19:41:39.281082 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.280927 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerDied","Data":"e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506"} Apr 22 19:41:39.559663 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.559622 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.562125 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.562102 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls\") pod \"xgboost-v2-mlserver-predictor-7799869d6f-v4chr\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.748147 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.748099 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:39.871446 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:39.871422 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr"] Apr 22 19:41:39.873615 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:41:39.873588 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e138105_ec8b_499d_8f52_7d1799081fef.slice/crio-9ebea8291071bdf51c7e0132dbfb230d33b717280ed7fbbb7a57538423cec6c3 WatchSource:0}: Error finding container 9ebea8291071bdf51c7e0132dbfb230d33b717280ed7fbbb7a57538423cec6c3: Status 404 returned error can't find the container with id 9ebea8291071bdf51c7e0132dbfb230d33b717280ed7fbbb7a57538423cec6c3 Apr 22 19:41:40.287838 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:40.287795 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerStarted","Data":"4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9"} Apr 22 19:41:40.287838 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:40.287833 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerStarted","Data":"9ebea8291071bdf51c7e0132dbfb230d33b717280ed7fbbb7a57538423cec6c3"} Apr 22 19:41:41.159823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:41.159779 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.56:8643/healthz\": dial tcp 10.132.0.56:8643: connect: connection 
refused" Apr 22 19:41:44.305891 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:44.305856 2579 generic.go:358] "Generic (PLEG): container finished" podID="5e138105-ec8b-499d-8f52-7d1799081fef" containerID="4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9" exitCode=0 Apr 22 19:41:44.306284 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:44.305914 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerDied","Data":"4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9"} Apr 22 19:41:45.287934 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.287911 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:41:45.312048 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.311887 2579 generic.go:358] "Generic (PLEG): container finished" podID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerID="db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683" exitCode=0 Apr 22 19:41:45.312048 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.312035 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerDied","Data":"db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683"} Apr 22 19:41:45.312558 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.312091 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" event={"ID":"314f65e5-754a-48bb-bc81-2204e42c55e7","Type":"ContainerDied","Data":"def6b7d3a67c92fa562cc6b22dd49661cc77e87d6f9b02d982fc3fff90611f6d"} Apr 22 19:41:45.312558 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.312115 2579 scope.go:117] "RemoveContainer" 
containerID="e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506" Apr 22 19:41:45.312558 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.312165 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5" Apr 22 19:41:45.314602 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.314571 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerStarted","Data":"d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33"} Apr 22 19:41:45.314746 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.314603 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerStarted","Data":"c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744"} Apr 22 19:41:45.314914 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.314892 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:45.314914 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.314924 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:41:45.320899 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.320880 2579 scope.go:117] "RemoveContainer" containerID="db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683" Apr 22 19:41:45.327697 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.327672 2579 scope.go:117] "RemoveContainer" containerID="2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58" Apr 22 19:41:45.335483 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.335464 2579 scope.go:117] 
"RemoveContainer" containerID="e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506" Apr 22 19:41:45.335790 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:41:45.335764 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506\": container with ID starting with e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506 not found: ID does not exist" containerID="e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506" Apr 22 19:41:45.335896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.335831 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506"} err="failed to get container status \"e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506\": rpc error: code = NotFound desc = could not find container \"e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506\": container with ID starting with e5e221ccdba8ce7d2eaf85fa2cd23c7990b164787dc5e29925eb2a3f4bc67506 not found: ID does not exist" Apr 22 19:41:45.335896 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.335860 2579 scope.go:117] "RemoveContainer" containerID="db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683" Apr 22 19:41:45.336144 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:41:45.336125 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683\": container with ID starting with db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683 not found: ID does not exist" containerID="db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683" Apr 22 19:41:45.336215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.336152 2579 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683"} err="failed to get container status \"db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683\": rpc error: code = NotFound desc = could not find container \"db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683\": container with ID starting with db6bd198afea8e6d09ad22698319ff7e80a0b8605b951b2847f7cfc3f8884683 not found: ID does not exist" Apr 22 19:41:45.336215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.336169 2579 scope.go:117] "RemoveContainer" containerID="2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58" Apr 22 19:41:45.336516 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:41:45.336412 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58\": container with ID starting with 2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58 not found: ID does not exist" containerID="2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58" Apr 22 19:41:45.336516 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.336438 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58"} err="failed to get container status \"2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58\": rpc error: code = NotFound desc = could not find container \"2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58\": container with ID starting with 2069f4c4e0de50920bf68a535e39ea9a3f4ee12f06d51f1a984efdc165c79c58 not found: ID does not exist" Apr 22 19:41:45.337202 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.337156 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" podStartSLOduration=7.337142377 podStartE2EDuration="7.337142377s" podCreationTimestamp="2026-04-22 19:41:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:41:45.33519773 +0000 UTC m=+3317.334771628" watchObservedRunningTime="2026-04-22 19:41:45.337142377 +0000 UTC m=+3317.336716277" Apr 22 19:41:45.404559 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.404469 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/314f65e5-754a-48bb-bc81-2204e42c55e7-kserve-provision-location\") pod \"314f65e5-754a-48bb-bc81-2204e42c55e7\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " Apr 22 19:41:45.404559 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.404506 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls\") pod \"314f65e5-754a-48bb-bc81-2204e42c55e7\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " Apr 22 19:41:45.404559 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.404539 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2s7wg\" (UniqueName: \"kubernetes.io/projected/314f65e5-754a-48bb-bc81-2204e42c55e7-kube-api-access-2s7wg\") pod \"314f65e5-754a-48bb-bc81-2204e42c55e7\" (UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " Apr 22 19:41:45.404854 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.404564 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/314f65e5-754a-48bb-bc81-2204e42c55e7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"314f65e5-754a-48bb-bc81-2204e42c55e7\" 
(UID: \"314f65e5-754a-48bb-bc81-2204e42c55e7\") " Apr 22 19:41:45.404910 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.404849 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/314f65e5-754a-48bb-bc81-2204e42c55e7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "314f65e5-754a-48bb-bc81-2204e42c55e7" (UID: "314f65e5-754a-48bb-bc81-2204e42c55e7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:41:45.405086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.405054 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/314f65e5-754a-48bb-bc81-2204e42c55e7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "314f65e5-754a-48bb-bc81-2204e42c55e7" (UID: "314f65e5-754a-48bb-bc81-2204e42c55e7"). InnerVolumeSpecName "isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:41:45.406669 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.406649 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/314f65e5-754a-48bb-bc81-2204e42c55e7-kube-api-access-2s7wg" (OuterVolumeSpecName: "kube-api-access-2s7wg") pod "314f65e5-754a-48bb-bc81-2204e42c55e7" (UID: "314f65e5-754a-48bb-bc81-2204e42c55e7"). InnerVolumeSpecName "kube-api-access-2s7wg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:41:45.406764 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.406706 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "314f65e5-754a-48bb-bc81-2204e42c55e7" (UID: "314f65e5-754a-48bb-bc81-2204e42c55e7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:41:45.505522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.505487 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2s7wg\" (UniqueName: \"kubernetes.io/projected/314f65e5-754a-48bb-bc81-2204e42c55e7-kube-api-access-2s7wg\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:41:45.505522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.505516 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/314f65e5-754a-48bb-bc81-2204e42c55e7-isvc-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:41:45.505522 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.505529 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/314f65e5-754a-48bb-bc81-2204e42c55e7-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:41:45.505765 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.505538 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/314f65e5-754a-48bb-bc81-2204e42c55e7-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:41:45.636922 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.636888 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5"] Apr 22 19:41:45.641583 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:45.641556 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-67d4bc6646-6zln5"] Apr 22 19:41:46.544518 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:46.543989 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" path="/var/lib/kubelet/pods/314f65e5-754a-48bb-bc81-2204e42c55e7/volumes" Apr 22 19:41:51.324282 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:41:51.324232 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:42:21.327936 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:21.327909 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:42:28.914200 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.914166 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr"] Apr 22 19:42:28.914675 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.914607 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kserve-container" containerID="cri-o://c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744" gracePeriod=30 Apr 22 19:42:28.914783 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.914721 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kube-rbac-proxy" containerID="cri-o://d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33" gracePeriod=30 Apr 22 19:42:28.992175 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992132 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz"] Apr 22 19:42:28.992515 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992498 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kube-rbac-proxy" Apr 22 19:42:28.992601 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992517 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kube-rbac-proxy" Apr 22 19:42:28.992601 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992535 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="storage-initializer" Apr 22 19:42:28.992601 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992543 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="storage-initializer" Apr 22 19:42:28.992601 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992554 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kserve-container" Apr 22 19:42:28.992601 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992563 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kserve-container" Apr 22 19:42:28.992863 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992630 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kube-rbac-proxy" Apr 22 19:42:28.992863 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.992644 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="314f65e5-754a-48bb-bc81-2204e42c55e7" containerName="kserve-container" Apr 22 19:42:28.995823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.995803 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:28.998569 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.998541 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-predictor-serving-cert\"" Apr 22 19:42:28.998814 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:28.998795 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:42:29.004967 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.004945 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz"] Apr 22 19:42:29.041713 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.041682 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.041841 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.041718 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.041841 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.041748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8gkt\" (UniqueName: 
\"kubernetes.io/projected/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kube-api-access-f8gkt\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.041945 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.041830 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.142215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.142187 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f8gkt\" (UniqueName: \"kubernetes.io/projected/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kube-api-access-f8gkt\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.142450 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.142228 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.142450 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.142305 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.142450 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.142336 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.142795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.142774 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.142931 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.142913 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.144684 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.144665 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-proxy-tls\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: 
\"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.150706 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.150683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8gkt\" (UniqueName: \"kubernetes.io/projected/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kube-api-access-f8gkt\") pod \"isvc-xgboost-runtime-predictor-779db84d9-smwhz\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.306132 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.306094 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:29.427842 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.427783 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz"] Apr 22 19:42:29.430425 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:42:29.430393 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda530ce5b_d5c9_4a95_9b85_38d5ad54c9fd.slice/crio-3a088b1f79f497fa4d9223561eae1ac09538dc9d22371333e1ac6c491d59a31b WatchSource:0}: Error finding container 3a088b1f79f497fa4d9223561eae1ac09538dc9d22371333e1ac6c491d59a31b: Status 404 returned error can't find the container with id 3a088b1f79f497fa4d9223561eae1ac09538dc9d22371333e1ac6c491d59a31b Apr 22 19:42:29.435684 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.435655 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerStarted","Data":"3a088b1f79f497fa4d9223561eae1ac09538dc9d22371333e1ac6c491d59a31b"} Apr 22 19:42:29.440603 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:42:29.440580 2579 generic.go:358] "Generic (PLEG): container finished" podID="5e138105-ec8b-499d-8f52-7d1799081fef" containerID="d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33" exitCode=2 Apr 22 19:42:29.440687 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:29.440629 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerDied","Data":"d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33"} Apr 22 19:42:30.444590 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:30.444554 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerStarted","Data":"de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b"} Apr 22 19:42:31.320076 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:31.320030 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.57:8643/healthz\": dial tcp 10.132.0.57:8643: connect: connection refused" Apr 22 19:42:33.453895 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:33.453858 2579 generic.go:358] "Generic (PLEG): container finished" podID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerID="de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b" exitCode=0 Apr 22 19:42:33.454375 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:33.453937 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerDied","Data":"de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b"} Apr 22 19:42:34.458820 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:34.458785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerStarted","Data":"1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7"} Apr 22 19:42:34.458820 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:34.458826 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerStarted","Data":"8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1"} Apr 22 19:42:34.459357 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:34.459111 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:34.459357 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:34.459223 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:34.460427 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:34.460403 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:42:34.477885 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:34.477849 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podStartSLOduration=6.477835747 podStartE2EDuration="6.477835747s" podCreationTimestamp="2026-04-22 19:42:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 19:42:34.477403327 +0000 UTC m=+3366.476977222" watchObservedRunningTime="2026-04-22 19:42:34.477835747 +0000 UTC m=+3366.477409646" Apr 22 19:42:35.452900 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.452879 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:42:35.462377 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.462351 2579 generic.go:358] "Generic (PLEG): container finished" podID="5e138105-ec8b-499d-8f52-7d1799081fef" containerID="c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744" exitCode=0 Apr 22 19:42:35.462755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.462388 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerDied","Data":"c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744"} Apr 22 19:42:35.462755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.462425 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" Apr 22 19:42:35.462755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.462439 2579 scope.go:117] "RemoveContainer" containerID="d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33" Apr 22 19:42:35.462755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.462429 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr" event={"ID":"5e138105-ec8b-499d-8f52-7d1799081fef","Type":"ContainerDied","Data":"9ebea8291071bdf51c7e0132dbfb230d33b717280ed7fbbb7a57538423cec6c3"} Apr 22 19:42:35.463034 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.462864 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:42:35.469632 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.469611 2579 scope.go:117] "RemoveContainer" containerID="c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744" Apr 22 19:42:35.477722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.477700 2579 scope.go:117] "RemoveContainer" containerID="4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9" Apr 22 19:42:35.486257 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.486239 2579 scope.go:117] "RemoveContainer" containerID="d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33" Apr 22 19:42:35.486556 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:42:35.486523 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33\": container with ID starting with d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33 not 
found: ID does not exist" containerID="d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33" Apr 22 19:42:35.486655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.486568 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33"} err="failed to get container status \"d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33\": rpc error: code = NotFound desc = could not find container \"d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33\": container with ID starting with d01f0d840627e93811a00573f3bce261b62373f72012ca1a9a97e116d8214e33 not found: ID does not exist" Apr 22 19:42:35.486655 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.486593 2579 scope.go:117] "RemoveContainer" containerID="c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744" Apr 22 19:42:35.486973 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:42:35.486958 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744\": container with ID starting with c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744 not found: ID does not exist" containerID="c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744" Apr 22 19:42:35.487025 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.486990 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744"} err="failed to get container status \"c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744\": rpc error: code = NotFound desc = could not find container \"c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744\": container with ID starting with c15e4c964461eedaad8a9dec5390259858d548e61d395ecfb0a3606905dc0744 not found: ID does not 
exist" Apr 22 19:42:35.487025 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.487008 2579 scope.go:117] "RemoveContainer" containerID="4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9" Apr 22 19:42:35.487301 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:42:35.487285 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9\": container with ID starting with 4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9 not found: ID does not exist" containerID="4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9" Apr 22 19:42:35.487370 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.487305 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9"} err="failed to get container status \"4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9\": rpc error: code = NotFound desc = could not find container \"4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9\": container with ID starting with 4c68ea243c08b4cf1b0391f8a48b5e83ab5090b0d0385efa0c8698cf0e51f8c9 not found: ID does not exist" Apr 22 19:42:35.489526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.489510 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e138105-ec8b-499d-8f52-7d1799081fef-kserve-provision-location\") pod \"5e138105-ec8b-499d-8f52-7d1799081fef\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " Apr 22 19:42:35.489590 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.489547 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lqpg\" (UniqueName: \"kubernetes.io/projected/5e138105-ec8b-499d-8f52-7d1799081fef-kube-api-access-4lqpg\") pod 
\"5e138105-ec8b-499d-8f52-7d1799081fef\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " Apr 22 19:42:35.489590 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.489575 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e138105-ec8b-499d-8f52-7d1799081fef-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") pod \"5e138105-ec8b-499d-8f52-7d1799081fef\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " Apr 22 19:42:35.489666 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.489605 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls\") pod \"5e138105-ec8b-499d-8f52-7d1799081fef\" (UID: \"5e138105-ec8b-499d-8f52-7d1799081fef\") " Apr 22 19:42:35.489983 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.489843 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e138105-ec8b-499d-8f52-7d1799081fef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e138105-ec8b-499d-8f52-7d1799081fef" (UID: "5e138105-ec8b-499d-8f52-7d1799081fef"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:42:35.489983 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.489926 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5e138105-ec8b-499d-8f52-7d1799081fef-xgboost-v2-mlserver-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "xgboost-v2-mlserver-kube-rbac-proxy-sar-config") pod "5e138105-ec8b-499d-8f52-7d1799081fef" (UID: "5e138105-ec8b-499d-8f52-7d1799081fef"). InnerVolumeSpecName "xgboost-v2-mlserver-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:42:35.491554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.491493 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e138105-ec8b-499d-8f52-7d1799081fef-kube-api-access-4lqpg" (OuterVolumeSpecName: "kube-api-access-4lqpg") pod "5e138105-ec8b-499d-8f52-7d1799081fef" (UID: "5e138105-ec8b-499d-8f52-7d1799081fef"). InnerVolumeSpecName "kube-api-access-4lqpg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:42:35.491639 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.491573 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "5e138105-ec8b-499d-8f52-7d1799081fef" (UID: "5e138105-ec8b-499d-8f52-7d1799081fef"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:42:35.590762 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.590728 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e138105-ec8b-499d-8f52-7d1799081fef-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:42:35.590762 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.590754 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4lqpg\" (UniqueName: \"kubernetes.io/projected/5e138105-ec8b-499d-8f52-7d1799081fef-kube-api-access-4lqpg\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:42:35.591006 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.590769 2579 reconciler_common.go:299] "Volume detached for volume \"xgboost-v2-mlserver-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/5e138105-ec8b-499d-8f52-7d1799081fef-xgboost-v2-mlserver-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath 
\"\"" Apr 22 19:42:35.591006 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.590784 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5e138105-ec8b-499d-8f52-7d1799081fef-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:42:35.788648 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.788609 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr"] Apr 22 19:42:35.795535 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:35.795506 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-7799869d6f-v4chr"] Apr 22 19:42:36.543114 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:36.543058 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" path="/var/lib/kubelet/pods/5e138105-ec8b-499d-8f52-7d1799081fef/volumes" Apr 22 19:42:40.466840 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:40.466813 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:42:40.467308 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:40.467285 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:42:50.467671 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:42:50.467633 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:43:00.467638 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:00.467593 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:43:10.467913 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:10.467872 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:43:20.467941 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:20.467895 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:43:30.468312 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:30.468246 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:43:40.468166 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:40.468138 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:43:49.089281 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.089238 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz"] Apr 22 19:43:49.089718 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.089668 2579 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" containerID="cri-o://8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1" gracePeriod=30 Apr 22 19:43:49.089838 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.089736 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kube-rbac-proxy" containerID="cri-o://1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7" gracePeriod=30 Apr 22 19:43:49.177760 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.177724 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q"] Apr 22 19:43:49.177994 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.177983 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kube-rbac-proxy" Apr 22 19:43:49.178039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.177996 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kube-rbac-proxy" Apr 22 19:43:49.178039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.178008 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="storage-initializer" Apr 22 19:43:49.178039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.178014 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="storage-initializer" Apr 22 19:43:49.178039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.178022 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kserve-container" Apr 22 19:43:49.178039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.178027 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kserve-container" Apr 22 19:43:49.178199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.178079 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kube-rbac-proxy" Apr 22 19:43:49.178199 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.178088 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e138105-ec8b-499d-8f52-7d1799081fef" containerName="kserve-container" Apr 22 19:43:49.181228 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.181211 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.183982 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.183958 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\"" Apr 22 19:43:49.184103 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.183957 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-runtime-predictor-serving-cert\"" Apr 22 19:43:49.192762 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.192740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q"] Apr 22 19:43:49.246566 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.246537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls\") pod 
\"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.246696 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.246571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b96d11bb-2702-4948-9479-0f934238af05-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.246696 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.246591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96d11bb-2702-4948-9479-0f934238af05-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.246696 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.246645 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs2zw\" (UniqueName: \"kubernetes.io/projected/b96d11bb-2702-4948-9479-0f934238af05-kube-api-access-qs2zw\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.347878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.347800 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qs2zw\" (UniqueName: 
\"kubernetes.io/projected/b96d11bb-2702-4948-9479-0f934238af05-kube-api-access-qs2zw\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.347878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.347862 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.348121 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.347886 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b96d11bb-2702-4948-9479-0f934238af05-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.348121 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.347914 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96d11bb-2702-4948-9479-0f934238af05-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.348121 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:43:49.348032 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-serving-cert: secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 22 
19:43:49.348121 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:43:49.348102 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls podName:b96d11bb-2702-4948-9479-0f934238af05 nodeName:}" failed. No retries permitted until 2026-04-22 19:43:49.848080276 +0000 UTC m=+3441.847654157 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls") pod "isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" (UID: "b96d11bb-2702-4948-9479-0f934238af05") : secret "isvc-xgboost-v2-runtime-predictor-serving-cert" not found Apr 22 19:43:49.348407 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.348388 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96d11bb-2702-4948-9479-0f934238af05-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.348564 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.348549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b96d11bb-2702-4948-9479-0f934238af05-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.356960 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.356934 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs2zw\" (UniqueName: \"kubernetes.io/projected/b96d11bb-2702-4948-9479-0f934238af05-kube-api-access-qs2zw\") pod 
\"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.671627 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.671546 2579 generic.go:358] "Generic (PLEG): container finished" podID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerID="1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7" exitCode=2 Apr 22 19:43:49.671772 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.671621 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerDied","Data":"1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7"} Apr 22 19:43:49.850652 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.850609 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:49.853099 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:49.853065 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls\") pod \"isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:50.090998 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:50.090959 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:50.217422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:50.217397 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q"] Apr 22 19:43:50.219313 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:43:50.219254 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb96d11bb_2702_4948_9479_0f934238af05.slice/crio-622a450a2d918aed4ad3b4d5755b47a120d299761daa51bf06a156942747fa03 WatchSource:0}: Error finding container 622a450a2d918aed4ad3b4d5755b47a120d299761daa51bf06a156942747fa03: Status 404 returned error can't find the container with id 622a450a2d918aed4ad3b4d5755b47a120d299761daa51bf06a156942747fa03 Apr 22 19:43:50.463713 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:50.463674 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.58:8643/healthz\": dial tcp 10.132.0.58:8643: connect: connection refused" Apr 22 19:43:50.467985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:50.467964 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 22 19:43:50.676005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:50.675968 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" 
event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerStarted","Data":"fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045"} Apr 22 19:43:50.676005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:50.676006 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerStarted","Data":"622a450a2d918aed4ad3b4d5755b47a120d299761daa51bf06a156942747fa03"} Apr 22 19:43:52.830678 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.830653 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:43:52.872411 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.872380 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8gkt\" (UniqueName: \"kubernetes.io/projected/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kube-api-access-f8gkt\") pod \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " Apr 22 19:43:52.872585 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.872431 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") pod \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " Apr 22 19:43:52.872585 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.872461 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-proxy-tls\") pod \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " Apr 22 19:43:52.872585 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:43:52.872506 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kserve-provision-location\") pod \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\" (UID: \"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd\") " Apr 22 19:43:52.872840 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.872810 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-isvc-xgboost-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-runtime-kube-rbac-proxy-sar-config") pod "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" (UID: "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd"). InnerVolumeSpecName "isvc-xgboost-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:43:52.872840 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.872818 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" (UID: "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:43:52.874554 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.874530 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" (UID: "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:43:52.874620 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.874585 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kube-api-access-f8gkt" (OuterVolumeSpecName: "kube-api-access-f8gkt") pod "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" (UID: "a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd"). InnerVolumeSpecName "kube-api-access-f8gkt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:43:52.973858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.973824 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:43:52.973858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.973853 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f8gkt\" (UniqueName: \"kubernetes.io/projected/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-kube-api-access-f8gkt\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:43:52.974039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.973867 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-isvc-xgboost-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:43:52.974039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:52.973881 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:43:53.686193 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.686156 2579 generic.go:358] "Generic (PLEG): container 
finished" podID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerID="8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1" exitCode=0 Apr 22 19:43:53.686442 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.686239 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerDied","Data":"8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1"} Apr 22 19:43:53.686442 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.686291 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" event={"ID":"a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd","Type":"ContainerDied","Data":"3a088b1f79f497fa4d9223561eae1ac09538dc9d22371333e1ac6c491d59a31b"} Apr 22 19:43:53.686442 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.686307 2579 scope.go:117] "RemoveContainer" containerID="1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7" Apr 22 19:43:53.686442 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.686248 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz" Apr 22 19:43:53.694345 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.694327 2579 scope.go:117] "RemoveContainer" containerID="8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1" Apr 22 19:43:53.700962 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.700945 2579 scope.go:117] "RemoveContainer" containerID="de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b" Apr 22 19:43:53.708027 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.707862 2579 scope.go:117] "RemoveContainer" containerID="1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7" Apr 22 19:43:53.708133 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:43:53.708118 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7\": container with ID starting with 1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7 not found: ID does not exist" containerID="1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7" Apr 22 19:43:53.708182 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.708143 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7"} err="failed to get container status \"1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7\": rpc error: code = NotFound desc = could not find container \"1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7\": container with ID starting with 1566e727ec80ed3ed217e5206fa0a9f1745c2c6077f2585148b21bf40d9b8cb7 not found: ID does not exist" Apr 22 19:43:53.708182 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.708163 2579 scope.go:117] "RemoveContainer" containerID="8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1" Apr 22 
19:43:53.708542 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:43:53.708518 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1\": container with ID starting with 8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1 not found: ID does not exist" containerID="8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1" Apr 22 19:43:53.708630 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.708548 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1"} err="failed to get container status \"8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1\": rpc error: code = NotFound desc = could not find container \"8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1\": container with ID starting with 8d9792d3a8ed2cf15e46bfde71c25380c35c432f7e9ceab1f9d9f272cde08ba1 not found: ID does not exist" Apr 22 19:43:53.708630 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.708568 2579 scope.go:117] "RemoveContainer" containerID="de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b" Apr 22 19:43:53.708757 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.708737 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz"] Apr 22 19:43:53.708850 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:43:53.708829 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b\": container with ID starting with de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b not found: ID does not exist" containerID="de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b" Apr 22 19:43:53.708911 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.708856 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b"} err="failed to get container status \"de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b\": rpc error: code = NotFound desc = could not find container \"de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b\": container with ID starting with de63f27a190c8932c2997ea2d69b9362cbe5d1793da12fb65257ea0d0ddb929b not found: ID does not exist" Apr 22 19:43:53.711080 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:53.711056 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-779db84d9-smwhz"] Apr 22 19:43:54.543224 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:54.543194 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" path="/var/lib/kubelet/pods/a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd/volumes" Apr 22 19:43:54.690394 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:54.690362 2579 generic.go:358] "Generic (PLEG): container finished" podID="b96d11bb-2702-4948-9479-0f934238af05" containerID="fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045" exitCode=0 Apr 22 19:43:54.690394 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:54.690397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerDied","Data":"fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045"} Apr 22 19:43:55.694961 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:55.694926 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" 
event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerStarted","Data":"6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155"} Apr 22 19:43:55.694961 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:55.694964 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerStarted","Data":"bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1"} Apr 22 19:43:55.695419 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:55.695184 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:55.695419 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:55.695207 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:43:55.716442 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:43:55.716399 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" podStartSLOduration=6.716382681 podStartE2EDuration="6.716382681s" podCreationTimestamp="2026-04-22 19:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:43:55.71437369 +0000 UTC m=+3447.713947615" watchObservedRunningTime="2026-04-22 19:43:55.716382681 +0000 UTC m=+3447.715956582" Apr 22 19:44:01.703083 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:01.703056 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:44:31.735024 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:31.734964 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 22 19:44:41.707209 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:41.707181 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:44:49.392011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.391926 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62"] Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392248 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kube-rbac-proxy" Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392282 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kube-rbac-proxy" Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392300 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="storage-initializer" Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392305 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="storage-initializer" Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392313 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392318 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" 
containerName="kserve-container" Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392364 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kserve-container" Apr 22 19:44:49.392413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.392372 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a530ce5b-d5c9-4a95-9b85-38d5ad54c9fd" containerName="kube-rbac-proxy" Apr 22 19:44:49.395304 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.395286 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.397932 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.397914 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-predictor-serving-cert\"" Apr 22 19:44:49.397932 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.397921 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-xgboost-v2-kube-rbac-proxy-sar-config\"" Apr 22 19:44:49.407868 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.407846 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62"] Apr 22 19:44:49.442491 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.442460 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q"] Apr 22 19:44:49.442876 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.442852 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kserve-container" containerID="cri-o://bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1" gracePeriod=30 Apr 22 
19:44:49.442952 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.442888 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kube-rbac-proxy" containerID="cri-o://6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155" gracePeriod=30 Apr 22 19:44:49.501594 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.501560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhs9c\" (UniqueName: \"kubernetes.io/projected/9c512295-2997-4c14-8c39-d51f997d11d8-kube-api-access-zhs9c\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.501755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.501610 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c512295-2997-4c14-8c39-d51f997d11d8-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.501755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.501637 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.501755 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.501660 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c512295-2997-4c14-8c39-d51f997d11d8-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.602824 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.602793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhs9c\" (UniqueName: \"kubernetes.io/projected/9c512295-2997-4c14-8c39-d51f997d11d8-kube-api-access-zhs9c\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.603001 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.602832 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c512295-2997-4c14-8c39-d51f997d11d8-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.603001 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.602855 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.603001 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.602870 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/9c512295-2997-4c14-8c39-d51f997d11d8-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.603001 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:44:49.602952 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-xgboost-v2-predictor-serving-cert: secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 22 19:44:49.603232 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:44:49.603052 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls podName:9c512295-2997-4c14-8c39-d51f997d11d8 nodeName:}" failed. No retries permitted until 2026-04-22 19:44:50.103014533 +0000 UTC m=+3502.102588420 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls") pod "isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" (UID: "9c512295-2997-4c14-8c39-d51f997d11d8") : secret "isvc-xgboost-v2-predictor-serving-cert" not found Apr 22 19:44:49.603232 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.603228 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c512295-2997-4c14-8c39-d51f997d11d8-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.603536 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.603517 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c512295-2997-4c14-8c39-d51f997d11d8-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod 
\"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.612377 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.612355 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhs9c\" (UniqueName: \"kubernetes.io/projected/9c512295-2997-4c14-8c39-d51f997d11d8-kube-api-access-zhs9c\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:49.847908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.847875 2579 generic.go:358] "Generic (PLEG): container finished" podID="b96d11bb-2702-4948-9479-0f934238af05" containerID="6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155" exitCode=2 Apr 22 19:44:49.848069 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:49.847958 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerDied","Data":"6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155"} Apr 22 19:44:50.105974 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:50.105879 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:50.108419 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:50.108386 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls\") pod \"isvc-xgboost-v2-predictor-6fcdd6977c-d9g62\" (UID: 
\"9c512295-2997-4c14-8c39-d51f997d11d8\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:50.304732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:50.304697 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:50.426492 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:50.426458 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62"] Apr 22 19:44:50.429294 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:44:50.429241 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c512295_2997_4c14_8c39_d51f997d11d8.slice/crio-d9389e4319aeb28be748d239ccf2d6d15906dc23c7b403e7c7a4a278d5430577 WatchSource:0}: Error finding container d9389e4319aeb28be748d239ccf2d6d15906dc23c7b403e7c7a4a278d5430577: Status 404 returned error can't find the container with id d9389e4319aeb28be748d239ccf2d6d15906dc23c7b403e7c7a4a278d5430577 Apr 22 19:44:50.431057 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:50.431039 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:44:50.853225 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:50.853189 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerStarted","Data":"00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256"} Apr 22 19:44:50.853225 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:50.853227 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerStarted","Data":"d9389e4319aeb28be748d239ccf2d6d15906dc23c7b403e7c7a4a278d5430577"} Apr 22 
19:44:51.697724 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:51.697683 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 22 19:44:52.744414 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:52.744375 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.59:8080/v2/models/isvc-xgboost-v2-runtime/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 22 19:44:54.865280 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:54.865238 2579 generic.go:358] "Generic (PLEG): container finished" podID="9c512295-2997-4c14-8c39-d51f997d11d8" containerID="00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256" exitCode=0 Apr 22 19:44:54.865644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:54.865312 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerDied","Data":"00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256"} Apr 22 19:44:55.869878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:55.869844 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerStarted","Data":"0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0"} Apr 22 19:44:55.869878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:55.869880 2579 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerStarted","Data":"b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949"} Apr 22 19:44:55.870337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:55.870148 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:55.870337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:55.870257 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:44:55.871408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:55.871389 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:44:55.898762 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:55.898727 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podStartSLOduration=6.89871706 podStartE2EDuration="6.89871706s" podCreationTimestamp="2026-04-22 19:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:44:55.89730655 +0000 UTC m=+3507.896880449" watchObservedRunningTime="2026-04-22 19:44:55.89871706 +0000 UTC m=+3507.898290956" Apr 22 19:44:56.698648 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.698604 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kube-rbac-proxy" probeResult="failure" output="Get 
\"https://10.132.0.59:8643/healthz\": dial tcp 10.132.0.59:8643: connect: connection refused" Apr 22 19:44:56.873594 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.873562 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:44:56.874405 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.874381 2579 generic.go:358] "Generic (PLEG): container finished" podID="b96d11bb-2702-4948-9479-0f934238af05" containerID="bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1" exitCode=0 Apr 22 19:44:56.874516 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.874449 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerDied","Data":"bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1"} Apr 22 19:44:56.874516 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.874484 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" event={"ID":"b96d11bb-2702-4948-9479-0f934238af05","Type":"ContainerDied","Data":"622a450a2d918aed4ad3b4d5755b47a120d299761daa51bf06a156942747fa03"} Apr 22 19:44:56.874516 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.874500 2579 scope.go:117] "RemoveContainer" containerID="6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155" Apr 22 19:44:56.875001 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.874968 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:44:56.881633 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.881617 2579 scope.go:117] 
"RemoveContainer" containerID="bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1" Apr 22 19:44:56.888196 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.888180 2579 scope.go:117] "RemoveContainer" containerID="fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045" Apr 22 19:44:56.894454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.894439 2579 scope.go:117] "RemoveContainer" containerID="6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155" Apr 22 19:44:56.894685 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:44:56.894667 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155\": container with ID starting with 6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155 not found: ID does not exist" containerID="6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155" Apr 22 19:44:56.894743 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.894692 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155"} err="failed to get container status \"6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155\": rpc error: code = NotFound desc = could not find container \"6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155\": container with ID starting with 6e3a4a8adfdef8c29ed9c63dad744508918854c89cb2c8657045ef6142f80155 not found: ID does not exist" Apr 22 19:44:56.894743 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.894711 2579 scope.go:117] "RemoveContainer" containerID="bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1" Apr 22 19:44:56.894941 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:44:56.894922 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1\": container with ID starting with bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1 not found: ID does not exist" containerID="bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1" Apr 22 19:44:56.894986 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.894946 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1"} err="failed to get container status \"bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1\": rpc error: code = NotFound desc = could not find container \"bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1\": container with ID starting with bafd1b1b3a765149bddbfdb914797270fc38be2d437bfebbaa4e15822e4803b1 not found: ID does not exist" Apr 22 19:44:56.894986 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.894961 2579 scope.go:117] "RemoveContainer" containerID="fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045" Apr 22 19:44:56.895180 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:44:56.895165 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045\": container with ID starting with fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045 not found: ID does not exist" containerID="fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045" Apr 22 19:44:56.895224 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:56.895182 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045"} err="failed to get container status \"fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045\": rpc error: code = NotFound desc = could not find container 
\"fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045\": container with ID starting with fc28ac871f03fef8863b52e84c746bd40255ff5980060c514a20e83613ed2045 not found: ID does not exist" Apr 22 19:44:57.060689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.060601 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls\") pod \"b96d11bb-2702-4948-9479-0f934238af05\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " Apr 22 19:44:57.060689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.060680 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96d11bb-2702-4948-9479-0f934238af05-kserve-provision-location\") pod \"b96d11bb-2702-4948-9479-0f934238af05\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " Apr 22 19:44:57.060912 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.060702 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b96d11bb-2702-4948-9479-0f934238af05-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") pod \"b96d11bb-2702-4948-9479-0f934238af05\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " Apr 22 19:44:57.060912 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.060721 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs2zw\" (UniqueName: \"kubernetes.io/projected/b96d11bb-2702-4948-9479-0f934238af05-kube-api-access-qs2zw\") pod \"b96d11bb-2702-4948-9479-0f934238af05\" (UID: \"b96d11bb-2702-4948-9479-0f934238af05\") " Apr 22 19:44:57.061093 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.061057 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/b96d11bb-2702-4948-9479-0f934238af05-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b96d11bb-2702-4948-9479-0f934238af05" (UID: "b96d11bb-2702-4948-9479-0f934238af05"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:44:57.061178 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.061070 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b96d11bb-2702-4948-9479-0f934238af05-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config") pod "b96d11bb-2702-4948-9479-0f934238af05" (UID: "b96d11bb-2702-4948-9479-0f934238af05"). InnerVolumeSpecName "isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:44:57.062841 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.062821 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "b96d11bb-2702-4948-9479-0f934238af05" (UID: "b96d11bb-2702-4948-9479-0f934238af05"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:44:57.062941 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.062924 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b96d11bb-2702-4948-9479-0f934238af05-kube-api-access-qs2zw" (OuterVolumeSpecName: "kube-api-access-qs2zw") pod "b96d11bb-2702-4948-9479-0f934238af05" (UID: "b96d11bb-2702-4948-9479-0f934238af05"). InnerVolumeSpecName "kube-api-access-qs2zw". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:44:57.161516 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.161479 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b96d11bb-2702-4948-9479-0f934238af05-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:44:57.161516 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.161512 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/b96d11bb-2702-4948-9479-0f934238af05-isvc-xgboost-v2-runtime-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:44:57.161724 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.161526 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qs2zw\" (UniqueName: \"kubernetes.io/projected/b96d11bb-2702-4948-9479-0f934238af05-kube-api-access-qs2zw\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:44:57.161724 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.161539 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/b96d11bb-2702-4948-9479-0f934238af05-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:44:57.877855 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.877824 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q" Apr 22 19:44:57.900829 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.900797 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q"] Apr 22 19:44:57.905685 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:57.905663 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-6dc5954dc-7q89q"] Apr 22 19:44:58.543176 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:44:58.543132 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b96d11bb-2702-4948-9479-0f934238af05" path="/var/lib/kubelet/pods/b96d11bb-2702-4948-9479-0f934238af05/volumes" Apr 22 19:45:01.879550 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:45:01.879521 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:45:01.880069 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:45:01.880044 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:45:11.880828 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:45:11.880784 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:45:21.880038 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:45:21.879998 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" 
podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:45:31.880125 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:45:31.880031 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:45:41.880042 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:45:41.880004 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:45:51.880295 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:45:51.880235 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:46:01.880916 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:01.880886 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:46:09.493521 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.493490 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62"] Apr 22 19:46:09.493959 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.493895 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" 
containerName="kserve-container" containerID="cri-o://b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949" gracePeriod=30 Apr 22 19:46:09.494048 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.493961 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kube-rbac-proxy" containerID="cri-o://0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0" gracePeriod=30 Apr 22 19:46:09.561085 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561051 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b"] Apr 22 19:46:09.561375 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561360 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kube-rbac-proxy" Apr 22 19:46:09.561466 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561378 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kube-rbac-proxy" Apr 22 19:46:09.561466 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561394 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kserve-container" Apr 22 19:46:09.561466 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561403 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kserve-container" Apr 22 19:46:09.561466 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561424 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="storage-initializer" Apr 22 19:46:09.561466 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561433 2579 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="storage-initializer" Apr 22 19:46:09.561722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561511 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kserve-container" Apr 22 19:46:09.561722 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.561526 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b96d11bb-2702-4948-9479-0f934238af05" containerName="kube-rbac-proxy" Apr 22 19:46:09.564772 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.564744 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.567657 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.567638 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-kube-rbac-proxy-sar-config\"" Apr 22 19:46:09.567657 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.567653 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-predictor-serving-cert\"" Apr 22 19:46:09.568054 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.568041 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 22 19:46:09.574427 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.574406 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b"] Apr 22 19:46:09.711425 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.711396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: 
\"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.711574 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.711431 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430dffda-2b1b-41c3-a61b-307bb2b42e1e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.711574 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.711457 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxh4n\" (UniqueName: \"kubernetes.io/projected/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kube-api-access-fxh4n\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.711574 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.711540 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/430dffda-2b1b-41c3-a61b-307bb2b42e1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.812795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.812700 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430dffda-2b1b-41c3-a61b-307bb2b42e1e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 
19:46:09.812795 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.812756 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxh4n\" (UniqueName: \"kubernetes.io/projected/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kube-api-access-fxh4n\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.813023 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.812833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/430dffda-2b1b-41c3-a61b-307bb2b42e1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.813023 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.812860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.813339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.813318 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.813520 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.813504 2579 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/430dffda-2b1b-41c3-a61b-307bb2b42e1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.815201 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.815185 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430dffda-2b1b-41c3-a61b-307bb2b42e1e-proxy-tls\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.822248 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.822225 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxh4n\" (UniqueName: \"kubernetes.io/projected/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kube-api-access-fxh4n\") pod \"isvc-sklearn-s3-predictor-88457d696-74h8b\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.875013 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.874982 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:09.991889 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:09.991865 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b"] Apr 22 19:46:09.994073 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:46:09.994048 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod430dffda_2b1b_41c3_a61b_307bb2b42e1e.slice/crio-16369a94b09c626e6c4160bb19e265d96f1c9979578e21c3e88280d0eebbceff WatchSource:0}: Error finding container 16369a94b09c626e6c4160bb19e265d96f1c9979578e21c3e88280d0eebbceff: Status 404 returned error can't find the container with id 16369a94b09c626e6c4160bb19e265d96f1c9979578e21c3e88280d0eebbceff Apr 22 19:46:10.069213 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:10.069156 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerStarted","Data":"e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2"} Apr 22 19:46:10.069213 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:10.069191 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerStarted","Data":"16369a94b09c626e6c4160bb19e265d96f1c9979578e21c3e88280d0eebbceff"} Apr 22 19:46:10.071119 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:10.071098 2579 generic.go:358] "Generic (PLEG): container finished" podID="9c512295-2997-4c14-8c39-d51f997d11d8" containerID="0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0" exitCode=2 Apr 22 19:46:10.071218 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:10.071150 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerDied","Data":"0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0"} Apr 22 19:46:11.075372 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:11.075339 2579 generic.go:358] "Generic (PLEG): container finished" podID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerID="e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2" exitCode=0 Apr 22 19:46:11.075744 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:11.075400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerDied","Data":"e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2"} Apr 22 19:46:11.875145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:11.875109 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.60:8643/healthz\": dial tcp 10.132.0.60:8643: connect: connection refused" Apr 22 19:46:11.880558 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:11.880534 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 22 19:46:12.080431 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:12.080400 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerStarted","Data":"d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc"} Apr 22 
19:46:12.080431 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:12.080433 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerStarted","Data":"925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15"} Apr 22 19:46:12.080841 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:12.080517 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:12.101594 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:12.101531 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podStartSLOduration=3.101516951 podStartE2EDuration="3.101516951s" podCreationTimestamp="2026-04-22 19:46:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:46:12.100526079 +0000 UTC m=+3584.100099997" watchObservedRunningTime="2026-04-22 19:46:12.101516951 +0000 UTC m=+3584.101090845" Apr 22 19:46:13.083175 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.083150 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:13.084346 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.084321 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 22 19:46:13.239557 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.239532 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:46:13.340526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.340433 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c512295-2997-4c14-8c39-d51f997d11d8-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") pod \"9c512295-2997-4c14-8c39-d51f997d11d8\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " Apr 22 19:46:13.340526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.340480 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhs9c\" (UniqueName: \"kubernetes.io/projected/9c512295-2997-4c14-8c39-d51f997d11d8-kube-api-access-zhs9c\") pod \"9c512295-2997-4c14-8c39-d51f997d11d8\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " Apr 22 19:46:13.340526 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.340497 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c512295-2997-4c14-8c39-d51f997d11d8-kserve-provision-location\") pod \"9c512295-2997-4c14-8c39-d51f997d11d8\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " Apr 22 19:46:13.340823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.340542 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls\") pod \"9c512295-2997-4c14-8c39-d51f997d11d8\" (UID: \"9c512295-2997-4c14-8c39-d51f997d11d8\") " Apr 22 19:46:13.340823 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.340775 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9c512295-2997-4c14-8c39-d51f997d11d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod 
"9c512295-2997-4c14-8c39-d51f997d11d8" (UID: "9c512295-2997-4c14-8c39-d51f997d11d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:46:13.340922 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.340845 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c512295-2997-4c14-8c39-d51f997d11d8-isvc-xgboost-v2-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-xgboost-v2-kube-rbac-proxy-sar-config") pod "9c512295-2997-4c14-8c39-d51f997d11d8" (UID: "9c512295-2997-4c14-8c39-d51f997d11d8"). InnerVolumeSpecName "isvc-xgboost-v2-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:46:13.343420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.343382 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9c512295-2997-4c14-8c39-d51f997d11d8" (UID: "9c512295-2997-4c14-8c39-d51f997d11d8"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:46:13.348084 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.348058 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c512295-2997-4c14-8c39-d51f997d11d8-kube-api-access-zhs9c" (OuterVolumeSpecName: "kube-api-access-zhs9c") pod "9c512295-2997-4c14-8c39-d51f997d11d8" (UID: "9c512295-2997-4c14-8c39-d51f997d11d8"). InnerVolumeSpecName "kube-api-access-zhs9c". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:46:13.441757 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.441715 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9c512295-2997-4c14-8c39-d51f997d11d8-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:46:13.441757 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.441751 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-xgboost-v2-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9c512295-2997-4c14-8c39-d51f997d11d8-isvc-xgboost-v2-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:46:13.441952 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.441766 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zhs9c\" (UniqueName: \"kubernetes.io/projected/9c512295-2997-4c14-8c39-d51f997d11d8-kube-api-access-zhs9c\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:46:13.441952 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:13.441781 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9c512295-2997-4c14-8c39-d51f997d11d8-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:46:14.087772 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.087734 2579 generic.go:358] "Generic (PLEG): container finished" podID="9c512295-2997-4c14-8c39-d51f997d11d8" containerID="b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949" exitCode=0 Apr 22 19:46:14.088223 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.087815 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" 
event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerDied","Data":"b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949"} Apr 22 19:46:14.088223 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.087856 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" event={"ID":"9c512295-2997-4c14-8c39-d51f997d11d8","Type":"ContainerDied","Data":"d9389e4319aeb28be748d239ccf2d6d15906dc23c7b403e7c7a4a278d5430577"} Apr 22 19:46:14.088223 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.087853 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62" Apr 22 19:46:14.088223 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.087871 2579 scope.go:117] "RemoveContainer" containerID="0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0" Apr 22 19:46:14.088716 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.088685 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 22 19:46:14.095538 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.095508 2579 scope.go:117] "RemoveContainer" containerID="b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949" Apr 22 19:46:14.102364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.102348 2579 scope.go:117] "RemoveContainer" containerID="00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256" Apr 22 19:46:14.108786 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.108770 2579 scope.go:117] "RemoveContainer" containerID="0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0" Apr 22 19:46:14.109031 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:46:14.109014 2579 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0\": container with ID starting with 0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0 not found: ID does not exist" containerID="0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0" Apr 22 19:46:14.109084 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.109038 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0"} err="failed to get container status \"0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0\": rpc error: code = NotFound desc = could not find container \"0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0\": container with ID starting with 0a9b811ece9a75386ee9d37ce9212ee051cb192796ef5a452f2836840e826dd0 not found: ID does not exist" Apr 22 19:46:14.109084 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.109055 2579 scope.go:117] "RemoveContainer" containerID="b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949" Apr 22 19:46:14.109256 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:46:14.109242 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949\": container with ID starting with b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949 not found: ID does not exist" containerID="b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949" Apr 22 19:46:14.109256 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.109273 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949"} err="failed to get container status 
\"b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949\": rpc error: code = NotFound desc = could not find container \"b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949\": container with ID starting with b858ffdadd290827fe4fcaf09f90173c7b31af3b056edfb9da7aeed4c1a4f949 not found: ID does not exist" Apr 22 19:46:14.109409 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.109285 2579 scope.go:117] "RemoveContainer" containerID="00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256" Apr 22 19:46:14.109558 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:46:14.109535 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256\": container with ID starting with 00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256 not found: ID does not exist" containerID="00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256" Apr 22 19:46:14.109631 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.109565 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256"} err="failed to get container status \"00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256\": rpc error: code = NotFound desc = could not find container \"00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256\": container with ID starting with 00950553697be2215b91eb6b5f0743e417a8552a1b20e5f78135a7cddf2ac256 not found: ID does not exist" Apr 22 19:46:14.111662 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.111643 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62"] Apr 22 19:46:14.115809 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.115789 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-6fcdd6977c-d9g62"] Apr 22 19:46:14.542985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:14.542950 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" path="/var/lib/kubelet/pods/9c512295-2997-4c14-8c39-d51f997d11d8/volumes" Apr 22 19:46:19.092232 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:19.092201 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:46:19.092718 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:19.092694 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 22 19:46:29.092894 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:29.092850 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 22 19:46:39.092706 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:39.092660 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 22 19:46:49.093180 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:49.093144 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.61:8080: connect: connection refused" Apr 22 19:46:59.093596 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:46:59.093551 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 22 19:47:09.093633 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:09.093594 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 22 19:47:19.093828 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.093799 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:47:19.672413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.672382 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b"] Apr 22 19:47:19.672694 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.672668 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" containerID="cri-o://925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15" gracePeriod=30 Apr 22 19:47:19.672784 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.672691 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kube-rbac-proxy" 
containerID="cri-o://d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc" gracePeriod=30 Apr 22 19:47:19.811347 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811316 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28"] Apr 22 19:47:19.811600 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811586 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kube-rbac-proxy" Apr 22 19:47:19.811644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811603 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kube-rbac-proxy" Apr 22 19:47:19.811644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811616 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="storage-initializer" Apr 22 19:47:19.811644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811621 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="storage-initializer" Apr 22 19:47:19.811644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811628 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" Apr 22 19:47:19.811644 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811634 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" Apr 22 19:47:19.811800 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811692 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kube-rbac-proxy" Apr 22 19:47:19.811800 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.811702 2579 
memory_manager.go:356] "RemoveStaleState removing state" podUID="9c512295-2997-4c14-8c39-d51f997d11d8" containerName="kserve-container" Apr 22 19:47:19.814697 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.814683 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.817903 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.817881 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-predictor-serving-cert\"" Apr 22 19:47:19.818005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.817988 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 19:47:19.818204 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.818186 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\"" Apr 22 19:47:19.829454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.829435 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28"] Apr 22 19:47:19.863374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.863350 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.863476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.863387 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.863476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.863459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.863564 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.863526 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbptk\" (UniqueName: \"kubernetes.io/projected/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kube-api-access-sbptk\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.863564 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.863551 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d80e1ef-880a-485f-93e9-7e08533fb9a8-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.964671 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.964638 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbptk\" (UniqueName: \"kubernetes.io/projected/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kube-api-access-sbptk\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.964671 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.964671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d80e1ef-880a-485f-93e9-7e08533fb9a8-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.964878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.964694 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.964878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.964723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.964878 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:47:19.964780 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.965107 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.965088 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.965619 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.965594 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.965690 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.965602 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.967069 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:47:19.967045 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d80e1ef-880a-485f-93e9-7e08533fb9a8-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:19.973860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:19.973838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbptk\" (UniqueName: \"kubernetes.io/projected/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kube-api-access-sbptk\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:20.124252 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:20.124217 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:20.244319 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:20.244217 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28"] Apr 22 19:47:20.247329 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:47:20.247303 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d80e1ef_880a_485f_93e9_7e08533fb9a8.slice/crio-2389e08661cbe74c0dad2575eec7be6c0cba0115af7983cca4990dbd7f6e5e63 WatchSource:0}: Error finding container 2389e08661cbe74c0dad2575eec7be6c0cba0115af7983cca4990dbd7f6e5e63: Status 404 returned error can't find the container with id 2389e08661cbe74c0dad2575eec7be6c0cba0115af7983cca4990dbd7f6e5e63 Apr 22 19:47:20.265660 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:20.265635 2579 generic.go:358] "Generic (PLEG): container finished" podID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerID="d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc" exitCode=2 Apr 22 19:47:20.265765 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:20.265708 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerDied","Data":"d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc"} Apr 22 19:47:20.266810 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:20.266785 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" event={"ID":"9d80e1ef-880a-485f-93e9-7e08533fb9a8","Type":"ContainerStarted","Data":"2389e08661cbe74c0dad2575eec7be6c0cba0115af7983cca4990dbd7f6e5e63"} Apr 22 19:47:21.270980 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:21.270933 2579 generic.go:358] "Generic 
(PLEG): container finished" podID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerID="fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe" exitCode=0 Apr 22 19:47:21.271406 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:21.271001 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" event={"ID":"9d80e1ef-880a-485f-93e9-7e08533fb9a8","Type":"ContainerDied","Data":"fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe"} Apr 22 19:47:22.275393 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:22.275352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" event={"ID":"9d80e1ef-880a-485f-93e9-7e08533fb9a8","Type":"ContainerStarted","Data":"9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9"} Apr 22 19:47:22.275393 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:22.275397 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" event={"ID":"9d80e1ef-880a-485f-93e9-7e08533fb9a8","Type":"ContainerStarted","Data":"b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2"} Apr 22 19:47:22.275828 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:22.275500 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:22.297042 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:22.296993 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podStartSLOduration=3.296978088 podStartE2EDuration="3.296978088s" podCreationTimestamp="2026-04-22 19:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 19:47:22.295013579 +0000 UTC m=+3654.294587478" watchObservedRunningTime="2026-04-22 19:47:22.296978088 +0000 UTC m=+3654.296551985" Apr 22 19:47:23.278524 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:23.278494 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:23.279573 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:23.279544 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 22 19:47:24.117885 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.117861 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:47:24.197159 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.197079 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430dffda-2b1b-41c3-a61b-307bb2b42e1e-proxy-tls\") pod \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " Apr 22 19:47:24.197159 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.197135 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/430dffda-2b1b-41c3-a61b-307bb2b42e1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") pod \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " Apr 22 19:47:24.197397 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.197161 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-fxh4n\" (UniqueName: \"kubernetes.io/projected/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kube-api-access-fxh4n\") pod \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " Apr 22 19:47:24.197397 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.197205 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kserve-provision-location\") pod \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\" (UID: \"430dffda-2b1b-41c3-a61b-307bb2b42e1e\") " Apr 22 19:47:24.197557 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.197535 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "430dffda-2b1b-41c3-a61b-307bb2b42e1e" (UID: "430dffda-2b1b-41c3-a61b-307bb2b42e1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:47:24.197613 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.197553 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/430dffda-2b1b-41c3-a61b-307bb2b42e1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-kube-rbac-proxy-sar-config") pod "430dffda-2b1b-41c3-a61b-307bb2b42e1e" (UID: "430dffda-2b1b-41c3-a61b-307bb2b42e1e"). InnerVolumeSpecName "isvc-sklearn-s3-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:47:24.199172 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.199151 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kube-api-access-fxh4n" (OuterVolumeSpecName: "kube-api-access-fxh4n") pod "430dffda-2b1b-41c3-a61b-307bb2b42e1e" (UID: "430dffda-2b1b-41c3-a61b-307bb2b42e1e"). InnerVolumeSpecName "kube-api-access-fxh4n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:47:24.199229 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.199160 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/430dffda-2b1b-41c3-a61b-307bb2b42e1e-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "430dffda-2b1b-41c3-a61b-307bb2b42e1e" (UID: "430dffda-2b1b-41c3-a61b-307bb2b42e1e"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:47:24.282649 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.282619 2579 generic.go:358] "Generic (PLEG): container finished" podID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerID="925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15" exitCode=0 Apr 22 19:47:24.283021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.282696 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" Apr 22 19:47:24.283021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.282697 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerDied","Data":"925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15"} Apr 22 19:47:24.283021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.282730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" event={"ID":"430dffda-2b1b-41c3-a61b-307bb2b42e1e","Type":"ContainerDied","Data":"16369a94b09c626e6c4160bb19e265d96f1c9979578e21c3e88280d0eebbceff"} Apr 22 19:47:24.283021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.282746 2579 scope.go:117] "RemoveContainer" containerID="d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc" Apr 22 19:47:24.283417 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.283388 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 22 19:47:24.291200 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.291174 2579 scope.go:117] "RemoveContainer" containerID="925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15" Apr 22 19:47:24.297985 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.297965 2579 scope.go:117] "RemoveContainer" containerID="e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2" Apr 22 19:47:24.298295 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.298276 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/430dffda-2b1b-41c3-a61b-307bb2b42e1e-isvc-sklearn-s3-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:47:24.298413 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.298391 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fxh4n\" (UniqueName: \"kubernetes.io/projected/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kube-api-access-fxh4n\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:47:24.298493 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.298425 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/430dffda-2b1b-41c3-a61b-307bb2b42e1e-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:47:24.298493 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.298442 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/430dffda-2b1b-41c3-a61b-307bb2b42e1e-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:47:24.304461 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.304446 2579 scope.go:117] "RemoveContainer" containerID="d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc" Apr 22 19:47:24.304696 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:47:24.304671 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc\": container with ID starting with d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc not found: ID does not exist" containerID="d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc" Apr 22 19:47:24.304858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.304700 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc"} err="failed to get container status \"d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc\": rpc error: code = NotFound desc = could not find container \"d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc\": container with ID starting with d77981f02455e08c15b0b6fe21d283d6ccdf30f9ce7a411e091c5d67416de8bc not found: ID does not exist" Apr 22 19:47:24.304858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.304716 2579 scope.go:117] "RemoveContainer" containerID="925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15" Apr 22 19:47:24.305115 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:47:24.305091 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15\": container with ID starting with 925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15 not found: ID does not exist" containerID="925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15" Apr 22 19:47:24.305388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.305123 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15"} err="failed to get container status \"925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15\": rpc error: code = NotFound desc = could not find container \"925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15\": container with ID starting with 925d9ec21323bccee181dd281f94186a27923a55c75057da9ad2c608c65cbc15 not found: ID does not exist" Apr 22 19:47:24.305388 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.305143 2579 scope.go:117] "RemoveContainer" containerID="e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2" Apr 22 19:47:24.305537 ip-10-0-137-19 
kubenswrapper[2579]: E0422 19:47:24.305432 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2\": container with ID starting with e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2 not found: ID does not exist" containerID="e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2" Apr 22 19:47:24.305537 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.305463 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2"} err="failed to get container status \"e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2\": rpc error: code = NotFound desc = could not find container \"e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2\": container with ID starting with e07c7e31a4300a31883b63cbc28efaf2327fcf3dfaf6a55c53e77b30c4f5d5e2 not found: ID does not exist" Apr 22 19:47:24.307121 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.307101 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b"] Apr 22 19:47:24.311656 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.311635 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b"] Apr 22 19:47:24.543578 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:24.543497 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" path="/var/lib/kubelet/pods/430dffda-2b1b-41c3-a61b-307bb2b42e1e/volumes" Apr 22 19:47:25.089468 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:25.089420 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-88457d696-74h8b" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" 
containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.61:8643/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 22 19:47:29.287977 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:29.287948 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:47:29.288506 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:29.288475 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 22 19:47:39.288509 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:39.288472 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 22 19:47:49.289077 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:49.288997 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 22 19:47:59.289233 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:47:59.289189 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" 
Apr 22 19:48:09.289374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:09.289334 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 22 19:48:19.289195 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:19.289160 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.62:8080: connect: connection refused" Apr 22 19:48:29.289223 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:29.289196 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:48:29.866221 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:29.866182 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28"] Apr 22 19:48:29.866539 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:29.866496 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" containerID="cri-o://b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2" gracePeriod=30 Apr 22 19:48:29.866609 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:29.866576 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kube-rbac-proxy" 
containerID="cri-o://9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9" gracePeriod=30 Apr 22 19:48:30.467778 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.467746 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerID="9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9" exitCode=2 Apr 22 19:48:30.468136 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.467821 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" event={"ID":"9d80e1ef-880a-485f-93e9-7e08533fb9a8","Type":"ContainerDied","Data":"9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9"} Apr 22 19:48:30.961424 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961390 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr"] Apr 22 19:48:30.961685 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961672 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" Apr 22 19:48:30.961685 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961686 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" Apr 22 19:48:30.961781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961694 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="storage-initializer" Apr 22 19:48:30.961781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961700 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="storage-initializer" Apr 22 19:48:30.961781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961707 2579 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kube-rbac-proxy" Apr 22 19:48:30.961781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961713 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kube-rbac-proxy" Apr 22 19:48:30.961781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961766 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kserve-container" Apr 22 19:48:30.961781 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.961774 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="430dffda-2b1b-41c3-a61b-307bb2b42e1e" containerName="kube-rbac-proxy" Apr 22 19:48:30.964784 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.964761 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:30.969615 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.969586 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\"" Apr 22 19:48:30.969615 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.969612 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-global-fail-predictor-serving-cert\"" Apr 22 19:48:30.977537 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:30.977514 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr"] Apr 22 19:48:31.102007 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.101975 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/10c31d86-d168-4de7-9eab-427b1ce65d44-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.102007 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.102019 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngvv9\" (UniqueName: \"kubernetes.io/projected/10c31d86-d168-4de7-9eab-427b1ce65d44-kube-api-access-ngvv9\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.102237 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.102041 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10c31d86-d168-4de7-9eab-427b1ce65d44-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.102237 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.102112 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c31d86-d168-4de7-9eab-427b1ce65d44-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.203400 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.203365 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10c31d86-d168-4de7-9eab-427b1ce65d44-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.203587 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.203409 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngvv9\" (UniqueName: \"kubernetes.io/projected/10c31d86-d168-4de7-9eab-427b1ce65d44-kube-api-access-ngvv9\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.203587 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.203432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10c31d86-d168-4de7-9eab-427b1ce65d44-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.203587 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.203567 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c31d86-d168-4de7-9eab-427b1ce65d44-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.203808 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.203783 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10c31d86-d168-4de7-9eab-427b1ce65d44-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.204054 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.204033 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10c31d86-d168-4de7-9eab-427b1ce65d44-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.205913 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.205890 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c31d86-d168-4de7-9eab-427b1ce65d44-proxy-tls\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.213279 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.213215 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngvv9\" (UniqueName: \"kubernetes.io/projected/10c31d86-d168-4de7-9eab-427b1ce65d44-kube-api-access-ngvv9\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.274867 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.274834 2579 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:31.397835 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.397811 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr"] Apr 22 19:48:31.399789 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:48:31.399762 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10c31d86_d168_4de7_9eab_427b1ce65d44.slice/crio-21f3924045d054e2a4b9aa7840051e3465acbe29f93e951e58a3019a3ef0c3d9 WatchSource:0}: Error finding container 21f3924045d054e2a4b9aa7840051e3465acbe29f93e951e58a3019a3ef0c3d9: Status 404 returned error can't find the container with id 21f3924045d054e2a4b9aa7840051e3465acbe29f93e951e58a3019a3ef0c3d9 Apr 22 19:48:31.472481 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.472403 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" event={"ID":"10c31d86-d168-4de7-9eab-427b1ce65d44","Type":"ContainerStarted","Data":"70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08"} Apr 22 19:48:31.472481 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:31.472445 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" event={"ID":"10c31d86-d168-4de7-9eab-427b1ce65d44","Type":"ContainerStarted","Data":"21f3924045d054e2a4b9aa7840051e3465acbe29f93e951e58a3019a3ef0c3d9"} Apr 22 19:48:34.000897 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.000874 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:48:34.128236 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128145 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kserve-provision-location\") pod \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " Apr 22 19:48:34.128236 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128193 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbptk\" (UniqueName: \"kubernetes.io/projected/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kube-api-access-sbptk\") pod \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " Apr 22 19:48:34.128499 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128242 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") pod \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " Apr 22 19:48:34.128499 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128282 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-cabundle-cert\") pod \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " Apr 22 19:48:34.128499 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128362 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d80e1ef-880a-485f-93e9-7e08533fb9a8-proxy-tls\") pod 
\"9d80e1ef-880a-485f-93e9-7e08533fb9a8\" (UID: \"9d80e1ef-880a-485f-93e9-7e08533fb9a8\") " Apr 22 19:48:34.128678 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128586 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9d80e1ef-880a-485f-93e9-7e08533fb9a8" (UID: "9d80e1ef-880a-485f-93e9-7e08533fb9a8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:34.128739 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128671 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config") pod "9d80e1ef-880a-485f-93e9-7e08533fb9a8" (UID: "9d80e1ef-880a-485f-93e9-7e08533fb9a8"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:48:34.128739 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.128724 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "9d80e1ef-880a-485f-93e9-7e08533fb9a8" (UID: "9d80e1ef-880a-485f-93e9-7e08533fb9a8"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:48:34.130425 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.130403 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d80e1ef-880a-485f-93e9-7e08533fb9a8-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "9d80e1ef-880a-485f-93e9-7e08533fb9a8" (UID: "9d80e1ef-880a-485f-93e9-7e08533fb9a8"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:48:34.130517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.130499 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kube-api-access-sbptk" (OuterVolumeSpecName: "kube-api-access-sbptk") pod "9d80e1ef-880a-485f-93e9-7e08533fb9a8" (UID: "9d80e1ef-880a-485f-93e9-7e08533fb9a8"). InnerVolumeSpecName "kube-api-access-sbptk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:48:34.229798 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.229758 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9d80e1ef-880a-485f-93e9-7e08533fb9a8-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:34.229798 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.229790 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:34.229798 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.229804 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbptk\" (UniqueName: \"kubernetes.io/projected/9d80e1ef-880a-485f-93e9-7e08533fb9a8-kube-api-access-sbptk\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:34.230047 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.229818 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-isvc-sklearn-s3-tls-global-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:34.230047 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.229833 
2579 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/9d80e1ef-880a-485f-93e9-7e08533fb9a8-cabundle-cert\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:34.481721 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.481684 2579 generic.go:358] "Generic (PLEG): container finished" podID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerID="b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2" exitCode=0 Apr 22 19:48:34.481898 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.481766 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" event={"ID":"9d80e1ef-880a-485f-93e9-7e08533fb9a8","Type":"ContainerDied","Data":"b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2"} Apr 22 19:48:34.481898 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.481802 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" Apr 22 19:48:34.481898 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.481809 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28" event={"ID":"9d80e1ef-880a-485f-93e9-7e08533fb9a8","Type":"ContainerDied","Data":"2389e08661cbe74c0dad2575eec7be6c0cba0115af7983cca4990dbd7f6e5e63"} Apr 22 19:48:34.481898 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.481826 2579 scope.go:117] "RemoveContainer" containerID="9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9" Apr 22 19:48:34.490120 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.490101 2579 scope.go:117] "RemoveContainer" containerID="b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2" Apr 22 19:48:34.496974 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.496951 2579 scope.go:117] "RemoveContainer" containerID="fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe" Apr 22 19:48:34.503332 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.503316 2579 scope.go:117] "RemoveContainer" containerID="9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9" Apr 22 19:48:34.503573 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:48:34.503550 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9\": container with ID starting with 9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9 not found: ID does not exist" containerID="9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9" Apr 22 19:48:34.503635 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.503580 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9"} 
err="failed to get container status \"9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9\": rpc error: code = NotFound desc = could not find container \"9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9\": container with ID starting with 9b9176f7f3096334071fa5e528d3d83d8985a40fb2c458e6b8252047720daab9 not found: ID does not exist" Apr 22 19:48:34.503635 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.503601 2579 scope.go:117] "RemoveContainer" containerID="b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2" Apr 22 19:48:34.506310 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.506099 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28"] Apr 22 19:48:34.506310 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:48:34.506183 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2\": container with ID starting with b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2 not found: ID does not exist" containerID="b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2" Apr 22 19:48:34.506310 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.506207 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2"} err="failed to get container status \"b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2\": rpc error: code = NotFound desc = could not find container \"b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2\": container with ID starting with b8e53a3b894468242f47c024b5e17433dead8bcf840638a2ed0cfe8ae448b8d2 not found: ID does not exist" Apr 22 19:48:34.506310 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.506227 2579 scope.go:117] "RemoveContainer" 
containerID="fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe" Apr 22 19:48:34.506660 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:48:34.506604 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe\": container with ID starting with fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe not found: ID does not exist" containerID="fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe" Apr 22 19:48:34.506660 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.506631 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe"} err="failed to get container status \"fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe\": rpc error: code = NotFound desc = could not find container \"fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe\": container with ID starting with fbad1305650e54c69a459fb657893ea57807233d00276211144ffe8e374716fe not found: ID does not exist" Apr 22 19:48:34.512274 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.512244 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5488974f76-8fb28"] Apr 22 19:48:34.542662 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:34.542635 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" path="/var/lib/kubelet/pods/9d80e1ef-880a-485f-93e9-7e08533fb9a8/volumes" Apr 22 19:48:36.488929 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:36.488900 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr_10c31d86-d168-4de7-9eab-427b1ce65d44/storage-initializer/0.log" Apr 22 19:48:36.489328 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:48:36.488939 2579 generic.go:358] "Generic (PLEG): container finished" podID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerID="70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08" exitCode=1 Apr 22 19:48:36.489328 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:36.489021 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" event={"ID":"10c31d86-d168-4de7-9eab-427b1ce65d44","Type":"ContainerDied","Data":"70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08"} Apr 22 19:48:37.492829 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:37.492800 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr_10c31d86-d168-4de7-9eab-427b1ce65d44/storage-initializer/0.log" Apr 22 19:48:37.493198 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:37.492903 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" event={"ID":"10c31d86-d168-4de7-9eab-427b1ce65d44","Type":"ContainerStarted","Data":"710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650"} Apr 22 19:48:41.027034 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:41.026991 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr"] Apr 22 19:48:41.027645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:41.027314 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerName="storage-initializer" containerID="cri-o://710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650" gracePeriod=30 Apr 22 19:48:42.060894 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.060866 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr_10c31d86-d168-4de7-9eab-427b1ce65d44/storage-initializer/1.log" Apr 22 19:48:42.061346 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061330 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr_10c31d86-d168-4de7-9eab-427b1ce65d44/storage-initializer/0.log" Apr 22 19:48:42.061412 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061376 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r"] Apr 22 19:48:42.061412 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061397 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:42.061698 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061687 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" Apr 22 19:48:42.061738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061701 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" Apr 22 19:48:42.061738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061713 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kube-rbac-proxy" Apr 22 19:48:42.061738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061718 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kube-rbac-proxy" Apr 22 19:48:42.061738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061731 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="storage-initializer" Apr 22 19:48:42.061738 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061736 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="storage-initializer" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061743 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerName="storage-initializer" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061748 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerName="storage-initializer" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061754 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerName="storage-initializer" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061759 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerName="storage-initializer" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061800 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerName="storage-initializer" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061808 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kube-rbac-proxy" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061814 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9d80e1ef-880a-485f-93e9-7e08533fb9a8" containerName="kserve-container" Apr 22 19:48:42.061908 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.061889 2579 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" containerName="storage-initializer" Apr 22 19:48:42.064676 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.064660 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.067463 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.067445 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 19:48:42.067575 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.067483 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\"" Apr 22 19:48:42.067575 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.067483 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert\"" Apr 22 19:48:42.075440 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.075421 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r"] Apr 22 19:48:42.086727 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.086708 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10c31d86-d168-4de7-9eab-427b1ce65d44-kserve-provision-location\") pod \"10c31d86-d168-4de7-9eab-427b1ce65d44\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " Apr 22 19:48:42.086843 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.086747 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c31d86-d168-4de7-9eab-427b1ce65d44-proxy-tls\") pod \"10c31d86-d168-4de7-9eab-427b1ce65d44\" (UID: 
\"10c31d86-d168-4de7-9eab-427b1ce65d44\") " Apr 22 19:48:42.086843 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.086775 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvv9\" (UniqueName: \"kubernetes.io/projected/10c31d86-d168-4de7-9eab-427b1ce65d44-kube-api-access-ngvv9\") pod \"10c31d86-d168-4de7-9eab-427b1ce65d44\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " Apr 22 19:48:42.087092 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.086884 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/10c31d86-d168-4de7-9eab-427b1ce65d44-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") pod \"10c31d86-d168-4de7-9eab-427b1ce65d44\" (UID: \"10c31d86-d168-4de7-9eab-427b1ce65d44\") " Apr 22 19:48:42.087092 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.086974 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10c31d86-d168-4de7-9eab-427b1ce65d44-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "10c31d86-d168-4de7-9eab-427b1ce65d44" (UID: "10c31d86-d168-4de7-9eab-427b1ce65d44"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:48:42.087092 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.087059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.087249 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.087100 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a35f4392-c9fd-4407-86e5-17dc19cf7642-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.087249 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.087135 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4j78\" (UniqueName: \"kubernetes.io/projected/a35f4392-c9fd-4407-86e5-17dc19cf7642-kube-api-access-h4j78\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.087249 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.087213 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.087423 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.087254 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.087423 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.087310 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/10c31d86-d168-4de7-9eab-427b1ce65d44-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:42.087423 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.087322 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/10c31d86-d168-4de7-9eab-427b1ce65d44-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config") pod "10c31d86-d168-4de7-9eab-427b1ce65d44" (UID: "10c31d86-d168-4de7-9eab-427b1ce65d44"). InnerVolumeSpecName "isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:48:42.089338 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.089313 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10c31d86-d168-4de7-9eab-427b1ce65d44-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "10c31d86-d168-4de7-9eab-427b1ce65d44" (UID: "10c31d86-d168-4de7-9eab-427b1ce65d44"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:48:42.089338 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.089321 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10c31d86-d168-4de7-9eab-427b1ce65d44-kube-api-access-ngvv9" (OuterVolumeSpecName: "kube-api-access-ngvv9") pod "10c31d86-d168-4de7-9eab-427b1ce65d44" (UID: "10c31d86-d168-4de7-9eab-427b1ce65d44"). InnerVolumeSpecName "kube-api-access-ngvv9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:48:42.188135 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188103 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.188337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.188337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188172 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.188337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a35f4392-c9fd-4407-86e5-17dc19cf7642-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.188337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188211 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h4j78\" (UniqueName: \"kubernetes.io/projected/a35f4392-c9fd-4407-86e5-17dc19cf7642-kube-api-access-h4j78\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.188337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188244 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/10c31d86-d168-4de7-9eab-427b1ce65d44-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:42.188337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188295 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ngvv9\" (UniqueName: \"kubernetes.io/projected/10c31d86-d168-4de7-9eab-427b1ce65d44-kube-api-access-ngvv9\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:42.188337 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188309 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\" (UniqueName: 
\"kubernetes.io/configmap/10c31d86-d168-4de7-9eab-427b1ce65d44-isvc-sklearn-s3-tls-global-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:48:42.188678 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:48:42.188358 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 22 19:48:42.188678 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:48:42.188448 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls podName:a35f4392-c9fd-4407-86e5-17dc19cf7642 nodeName:}" failed. No retries permitted until 2026-04-22 19:48:42.688427778 +0000 UTC m=+3734.688001659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls") pod "isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" (UID: "a35f4392-c9fd-4407-86e5-17dc19cf7642") : secret "isvc-sklearn-s3-tls-custom-pass-predictor-serving-cert" not found Apr 22 19:48:42.188753 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188691 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a35f4392-c9fd-4407-86e5-17dc19cf7642-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.188862 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188845 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: 
\"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.188899 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.188861 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.199420 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.199394 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4j78\" (UniqueName: \"kubernetes.io/projected/a35f4392-c9fd-4407-86e5-17dc19cf7642-kube-api-access-h4j78\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.507981 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.507903 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr_10c31d86-d168-4de7-9eab-427b1ce65d44/storage-initializer/1.log" Apr 22 19:48:42.508313 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.508294 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr_10c31d86-d168-4de7-9eab-427b1ce65d44/storage-initializer/0.log" Apr 22 19:48:42.508417 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.508333 2579 generic.go:358] "Generic (PLEG): container finished" podID="10c31d86-d168-4de7-9eab-427b1ce65d44" 
containerID="710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650" exitCode=1 Apr 22 19:48:42.508417 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.508413 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" Apr 22 19:48:42.508514 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.508411 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" event={"ID":"10c31d86-d168-4de7-9eab-427b1ce65d44","Type":"ContainerDied","Data":"710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650"} Apr 22 19:48:42.508575 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.508527 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr" event={"ID":"10c31d86-d168-4de7-9eab-427b1ce65d44","Type":"ContainerDied","Data":"21f3924045d054e2a4b9aa7840051e3465acbe29f93e951e58a3019a3ef0c3d9"} Apr 22 19:48:42.508575 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.508551 2579 scope.go:117] "RemoveContainer" containerID="710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650" Apr 22 19:48:42.516440 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.516422 2579 scope.go:117] "RemoveContainer" containerID="70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08" Apr 22 19:48:42.523017 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.523004 2579 scope.go:117] "RemoveContainer" containerID="710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650" Apr 22 19:48:42.523254 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:48:42.523238 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650\": container with ID starting with 
710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650 not found: ID does not exist" containerID="710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650" Apr 22 19:48:42.523356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.523281 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650"} err="failed to get container status \"710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650\": rpc error: code = NotFound desc = could not find container \"710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650\": container with ID starting with 710725cdb7214eb4d916b4aad7b5478fff19645333d947565348c7643aa18650 not found: ID does not exist" Apr 22 19:48:42.523356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.523298 2579 scope.go:117] "RemoveContainer" containerID="70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08" Apr 22 19:48:42.523557 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:48:42.523542 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08\": container with ID starting with 70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08 not found: ID does not exist" containerID="70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08" Apr 22 19:48:42.523598 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.523562 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08"} err="failed to get container status \"70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08\": rpc error: code = NotFound desc = could not find container \"70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08\": container with ID starting with 
70f4f267282cdfcaff663716094ff2020a8f89e8aacb53f92528a0f43c716b08 not found: ID does not exist" Apr 22 19:48:42.565392 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.565365 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr"] Apr 22 19:48:42.581550 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.581520 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-54884788bb-l7psr"] Apr 22 19:48:42.691346 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.691304 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.693712 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.693682 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:42.973915 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:42.973880 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:43.091648 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:43.091623 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r"] Apr 22 19:48:43.093550 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:48:43.093525 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda35f4392_c9fd_4407_86e5_17dc19cf7642.slice/crio-c7d2490e6f5859542cca1d70e922e65a206ee4161a46ef54d02689f1b70ce0af WatchSource:0}: Error finding container c7d2490e6f5859542cca1d70e922e65a206ee4161a46ef54d02689f1b70ce0af: Status 404 returned error can't find the container with id c7d2490e6f5859542cca1d70e922e65a206ee4161a46ef54d02689f1b70ce0af Apr 22 19:48:43.513493 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:43.513455 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerStarted","Data":"19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1"} Apr 22 19:48:43.513493 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:43.513496 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerStarted","Data":"c7d2490e6f5859542cca1d70e922e65a206ee4161a46ef54d02689f1b70ce0af"} Apr 22 19:48:44.518573 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:44.518533 2579 generic.go:358] "Generic (PLEG): container finished" podID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerID="19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1" exitCode=0 Apr 22 19:48:44.518929 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:44.518572 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerDied","Data":"19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1"} Apr 22 19:48:44.549546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:44.549505 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10c31d86-d168-4de7-9eab-427b1ce65d44" path="/var/lib/kubelet/pods/10c31d86-d168-4de7-9eab-427b1ce65d44/volumes" Apr 22 19:48:45.523195 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:45.523156 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerStarted","Data":"f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264"} Apr 22 19:48:45.523195 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:45.523199 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerStarted","Data":"1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5"} Apr 22 19:48:45.523646 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:45.523309 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:45.548331 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:45.548291 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podStartSLOduration=3.548246278 podStartE2EDuration="3.548246278s" podCreationTimestamp="2026-04-22 19:48:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 19:48:45.546693252 +0000 UTC m=+3737.546267148" watchObservedRunningTime="2026-04-22 19:48:45.548246278 +0000 UTC m=+3737.547820175" Apr 22 19:48:46.525800 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:46.525769 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:46.526737 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:46.526708 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:48:47.528664 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:47.528625 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:48:52.533047 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:52.533019 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:48:52.533598 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:48:52.533572 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:49:02.533645 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:49:02.533600 2579 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:49:12.533628 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:49:12.533584 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:49:22.534118 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:49:22.534080 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:49:32.533970 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:49:32.533930 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:49:42.533588 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:49:42.533542 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:49:52.534215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:49:52.534179 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:50:02.177448 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:02.177413 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r"] Apr 22 19:50:02.177836 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:02.177736 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" containerID="cri-o://1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5" gracePeriod=30 Apr 22 19:50:02.177913 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:02.177798 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kube-rbac-proxy" containerID="cri-o://f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264" gracePeriod=30 Apr 22 19:50:02.529800 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:02.529759 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.64:8643/healthz\": dial tcp 10.132.0.64:8643: connect: connection refused" Apr 22 19:50:02.534073 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:02.534036 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 22 19:50:02.733319 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:50:02.733248 2579 generic.go:358] "Generic (PLEG): container finished" podID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerID="f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264" exitCode=2 Apr 22 19:50:02.733508 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:02.733332 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerDied","Data":"f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264"} Apr 22 19:50:03.165356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.165321 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh"] Apr 22 19:50:03.168763 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.168746 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.175317 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.175295 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert\"" Apr 22 19:50:03.175317 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.175328 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\"" Apr 22 19:50:03.186080 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.186054 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh"] Apr 22 19:50:03.198411 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.198380 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrg4p\" (UniqueName: 
\"kubernetes.io/projected/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kube-api-access-xrg4p\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.198411 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.198413 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.198572 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.198445 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.198572 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.198526 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.299206 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.299168 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.299408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.299222 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrg4p\" (UniqueName: \"kubernetes.io/projected/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kube-api-access-xrg4p\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.299408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.299254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.299408 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:50:03.299345 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 22 19:50:03.299408 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:50:03.299408 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls podName:6ba0c826-c8ba-43e6-89d3-d6ec227de0b6 nodeName:}" failed. 
No retries permitted until 2026-04-22 19:50:03.799389185 +0000 UTC m=+3815.798963066 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls") pod "isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" (UID: "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6") : secret "isvc-sklearn-s3-tls-custom-fail-predictor-serving-cert" not found Apr 22 19:50:03.299612 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.299429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.299714 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.299691 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.300011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.299993 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.313228 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.313193 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrg4p\" (UniqueName: \"kubernetes.io/projected/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kube-api-access-xrg4p\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.803166 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.803112 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:03.805628 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:03.805607 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:04.078625 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:04.078526 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:04.205579 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:04.205418 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh"] Apr 22 19:50:04.208466 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:50:04.208436 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ba0c826_c8ba_43e6_89d3_d6ec227de0b6.slice/crio-7bdc541c38bcf371acf2514099b9097ab0642dd32fcfb09e94eb043d147da27e WatchSource:0}: Error finding container 7bdc541c38bcf371acf2514099b9097ab0642dd32fcfb09e94eb043d147da27e: Status 404 returned error can't find the container with id 7bdc541c38bcf371acf2514099b9097ab0642dd32fcfb09e94eb043d147da27e Apr 22 19:50:04.210187 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:04.210173 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 19:50:04.740494 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:04.740460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" event={"ID":"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6","Type":"ContainerStarted","Data":"5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02"} Apr 22 19:50:04.740494 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:04.740499 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" event={"ID":"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6","Type":"ContainerStarted","Data":"7bdc541c38bcf371acf2514099b9097ab0642dd32fcfb09e94eb043d147da27e"} Apr 22 19:50:06.620696 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.620674 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:50:06.726799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.726720 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a35f4392-c9fd-4407-86e5-17dc19cf7642-kserve-provision-location\") pod \"a35f4392-c9fd-4407-86e5-17dc19cf7642\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " Apr 22 19:50:06.726799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.726775 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-cabundle-cert\") pod \"a35f4392-c9fd-4407-86e5-17dc19cf7642\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " Apr 22 19:50:06.727003 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.726806 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls\") pod \"a35f4392-c9fd-4407-86e5-17dc19cf7642\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " Apr 22 19:50:06.727003 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.726859 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4j78\" (UniqueName: \"kubernetes.io/projected/a35f4392-c9fd-4407-86e5-17dc19cf7642-kube-api-access-h4j78\") pod \"a35f4392-c9fd-4407-86e5-17dc19cf7642\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " Apr 22 19:50:06.727003 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.726913 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") pod 
\"a35f4392-c9fd-4407-86e5-17dc19cf7642\" (UID: \"a35f4392-c9fd-4407-86e5-17dc19cf7642\") " Apr 22 19:50:06.727173 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.727130 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a35f4392-c9fd-4407-86e5-17dc19cf7642-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a35f4392-c9fd-4407-86e5-17dc19cf7642" (UID: "a35f4392-c9fd-4407-86e5-17dc19cf7642"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:06.727280 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.727232 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a35f4392-c9fd-4407-86e5-17dc19cf7642" (UID: "a35f4392-c9fd-4407-86e5-17dc19cf7642"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:50:06.727342 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.727325 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config") pod "a35f4392-c9fd-4407-86e5-17dc19cf7642" (UID: "a35f4392-c9fd-4407-86e5-17dc19cf7642"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:50:06.729005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.728986 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "a35f4392-c9fd-4407-86e5-17dc19cf7642" (UID: "a35f4392-c9fd-4407-86e5-17dc19cf7642"). 
InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:50:06.729081 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.729029 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a35f4392-c9fd-4407-86e5-17dc19cf7642-kube-api-access-h4j78" (OuterVolumeSpecName: "kube-api-access-h4j78") pod "a35f4392-c9fd-4407-86e5-17dc19cf7642" (UID: "a35f4392-c9fd-4407-86e5-17dc19cf7642"). InnerVolumeSpecName "kube-api-access-h4j78". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:50:06.749335 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.749299 2579 generic.go:358] "Generic (PLEG): container finished" podID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerID="1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5" exitCode=0 Apr 22 19:50:06.749454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.749344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerDied","Data":"1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5"} Apr 22 19:50:06.749454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.749380 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" event={"ID":"a35f4392-c9fd-4407-86e5-17dc19cf7642","Type":"ContainerDied","Data":"c7d2490e6f5859542cca1d70e922e65a206ee4161a46ef54d02689f1b70ce0af"} Apr 22 19:50:06.749454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.749401 2579 scope.go:117] "RemoveContainer" containerID="f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264" Apr 22 19:50:06.749454 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.749416 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r" Apr 22 19:50:06.757155 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.757134 2579 scope.go:117] "RemoveContainer" containerID="1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5" Apr 22 19:50:06.764241 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.764227 2579 scope.go:117] "RemoveContainer" containerID="19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1" Apr 22 19:50:06.770576 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.770558 2579 scope.go:117] "RemoveContainer" containerID="f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264" Apr 22 19:50:06.770802 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:50:06.770782 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264\": container with ID starting with f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264 not found: ID does not exist" containerID="f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264" Apr 22 19:50:06.770858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.770810 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264"} err="failed to get container status \"f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264\": rpc error: code = NotFound desc = could not find container \"f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264\": container with ID starting with f32b13ff3b5001a99d6423d7d4596064aa3894e696ba17b10cf65d658e865264 not found: ID does not exist" Apr 22 19:50:06.770858 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.770827 2579 scope.go:117] "RemoveContainer" containerID="1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5" Apr 22 
19:50:06.771122 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:50:06.771091 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5\": container with ID starting with 1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5 not found: ID does not exist" containerID="1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5" Apr 22 19:50:06.771215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.771122 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5"} err="failed to get container status \"1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5\": rpc error: code = NotFound desc = could not find container \"1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5\": container with ID starting with 1bb4d69401087cff073031562f97736d8e08f36b437ce905d60fb5ba1a7ca7d5 not found: ID does not exist" Apr 22 19:50:06.771215 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.771143 2579 scope.go:117] "RemoveContainer" containerID="19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1" Apr 22 19:50:06.771485 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:50:06.771462 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1\": container with ID starting with 19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1 not found: ID does not exist" containerID="19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1" Apr 22 19:50:06.771550 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.771496 2579 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1"} err="failed to get container status \"19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1\": rpc error: code = NotFound desc = could not find container \"19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1\": container with ID starting with 19dad820b668f2fbc82109459fd42b15b1fc7ad6e3a9b951051254e7473b04e1 not found: ID does not exist" Apr 22 19:50:06.772900 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.772880 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r"] Apr 22 19:50:06.780065 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.780046 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-7877ccc664-xrk7r"] Apr 22 19:50:06.828208 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.828176 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h4j78\" (UniqueName: \"kubernetes.io/projected/a35f4392-c9fd-4407-86e5-17dc19cf7642-kube-api-access-h4j78\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:06.828208 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.828207 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-isvc-sklearn-s3-tls-custom-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:06.828356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.828219 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a35f4392-c9fd-4407-86e5-17dc19cf7642-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:06.828356 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:50:06.828228 2579 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a35f4392-c9fd-4407-86e5-17dc19cf7642-cabundle-cert\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:06.828356 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:06.828237 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a35f4392-c9fd-4407-86e5-17dc19cf7642-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:08.542560 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:08.542521 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" path="/var/lib/kubelet/pods/a35f4392-c9fd-4407-86e5-17dc19cf7642/volumes" Apr 22 19:50:09.758759 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:09.758735 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh_6ba0c826-c8ba-43e6-89d3-d6ec227de0b6/storage-initializer/0.log" Apr 22 19:50:09.759145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:09.758770 2579 generic.go:358] "Generic (PLEG): container finished" podID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerID="5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02" exitCode=1 Apr 22 19:50:09.759145 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:09.758832 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" event={"ID":"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6","Type":"ContainerDied","Data":"5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02"} Apr 22 19:50:10.763151 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:10.763121 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh_6ba0c826-c8ba-43e6-89d3-d6ec227de0b6/storage-initializer/0.log" Apr 22 19:50:10.763652 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:10.763228 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" event={"ID":"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6","Type":"ContainerStarted","Data":"1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6"} Apr 22 19:50:13.254119 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:13.254085 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh"] Apr 22 19:50:13.254525 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:13.254393 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerName="storage-initializer" containerID="cri-o://1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6" gracePeriod=30 Apr 22 19:50:14.253924 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.253891 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"] Apr 22 19:50:14.254167 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254155 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" Apr 22 19:50:14.254476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254169 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" Apr 22 19:50:14.254476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254182 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kube-rbac-proxy" Apr 22 19:50:14.254476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254188 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kube-rbac-proxy" Apr 22 19:50:14.254476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254203 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="storage-initializer" Apr 22 19:50:14.254476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254208 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="storage-initializer" Apr 22 19:50:14.254476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254249 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kube-rbac-proxy" Apr 22 19:50:14.254476 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.254257 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="a35f4392-c9fd-4407-86e5-17dc19cf7642" containerName="kserve-container" Apr 22 19:50:14.256769 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.256753 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.261202 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.261177 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-predictor-serving-cert\"" Apr 22 19:50:14.261202 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.261183 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 22 19:50:14.261391 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.261285 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\"" Apr 22 19:50:14.279654 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.279625 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"] Apr 22 19:50:14.282788 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.282763 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv2sb\" (UniqueName: \"kubernetes.io/projected/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kube-api-access-kv2sb\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.282906 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.282799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod 
\"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.282906 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.282895 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.283011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.282930 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.283011 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.282961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.384032 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.383996 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kv2sb\" (UniqueName: \"kubernetes.io/projected/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kube-api-access-kv2sb\") pod 
\"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.384173 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.384043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.384173 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.384095 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.384173 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.384116 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.384173 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.384145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-cabundle-cert\") 
pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.384637 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.384610 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.384891 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.384864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.385051 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.384954 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.386682 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.386663 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-proxy-tls\") pod 
\"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.396298 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.396280 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kv2sb\" (UniqueName: \"kubernetes.io/projected/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kube-api-access-kv2sb\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.567223 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.567194 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:14.597033 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.597009 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh_6ba0c826-c8ba-43e6-89d3-d6ec227de0b6/storage-initializer/1.log" Apr 22 19:50:14.597406 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.597392 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh_6ba0c826-c8ba-43e6-89d3-d6ec227de0b6/storage-initializer/0.log" Apr 22 19:50:14.597494 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.597458 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:14.686514 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.686479 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls\") pod \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " Apr 22 19:50:14.686654 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.686531 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") pod \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " Apr 22 19:50:14.686654 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.686562 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kserve-provision-location\") pod \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " Apr 22 19:50:14.686654 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.686626 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrg4p\" (UniqueName: \"kubernetes.io/projected/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kube-api-access-xrg4p\") pod \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\" (UID: \"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6\") " Apr 22 19:50:14.686883 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.686863 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kserve-provision-location" (OuterVolumeSpecName: 
"kserve-provision-location") pod "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" (UID: "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 19:50:14.686970 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.686942 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config") pod "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" (UID: "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6"). InnerVolumeSpecName "isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 19:50:14.688552 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.688533 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kube-api-access-xrg4p" (OuterVolumeSpecName: "kube-api-access-xrg4p") pod "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" (UID: "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6"). InnerVolumeSpecName "kube-api-access-xrg4p". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 19:50:14.688624 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.688579 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" (UID: "6ba0c826-c8ba-43e6-89d3-d6ec227de0b6"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 19:50:14.699977 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.699956 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"] Apr 22 19:50:14.703148 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:50:14.703129 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8bf4f6c_405a_4028_81a2_3fc61d071bc1.slice/crio-ac8337ef83fe3b9c3628ff90dcf5416561c36ff0adcd0d9c2de48de294e32abe WatchSource:0}: Error finding container ac8337ef83fe3b9c3628ff90dcf5416561c36ff0adcd0d9c2de48de294e32abe: Status 404 returned error can't find the container with id ac8337ef83fe3b9c3628ff90dcf5416561c36ff0adcd0d9c2de48de294e32abe Apr 22 19:50:14.774553 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.774504 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh_6ba0c826-c8ba-43e6-89d3-d6ec227de0b6/storage-initializer/1.log" Apr 22 19:50:14.774883 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.774868 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh_6ba0c826-c8ba-43e6-89d3-d6ec227de0b6/storage-initializer/0.log" Apr 22 19:50:14.774948 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.774904 2579 generic.go:358] "Generic (PLEG): container finished" podID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerID="1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6" exitCode=1 Apr 22 19:50:14.775002 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.774974 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" 
event={"ID":"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6","Type":"ContainerDied","Data":"1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6"} Apr 22 19:50:14.775060 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.775000 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" Apr 22 19:50:14.775060 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.775016 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh" event={"ID":"6ba0c826-c8ba-43e6-89d3-d6ec227de0b6","Type":"ContainerDied","Data":"7bdc541c38bcf371acf2514099b9097ab0642dd32fcfb09e94eb043d147da27e"} Apr 22 19:50:14.775060 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.775035 2579 scope.go:117] "RemoveContainer" containerID="1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6" Apr 22 19:50:14.776422 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.776393 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerStarted","Data":"2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453"} Apr 22 19:50:14.776505 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.776430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerStarted","Data":"ac8337ef83fe3b9c3628ff90dcf5416561c36ff0adcd0d9c2de48de294e32abe"} Apr 22 19:50:14.785045 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.785026 2579 scope.go:117] "RemoveContainer" containerID="5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02" Apr 22 19:50:14.787226 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.787206 2579 
reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:14.787335 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.787231 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-isvc-sklearn-s3-tls-custom-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:14.787335 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.787248 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:14.787335 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.787282 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrg4p\" (UniqueName: \"kubernetes.io/projected/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6-kube-api-access-xrg4p\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\"" Apr 22 19:50:14.792058 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.792040 2579 scope.go:117] "RemoveContainer" containerID="1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6" Apr 22 19:50:14.792344 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:50:14.792321 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6\": container with ID starting with 1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6 not found: ID does not exist" containerID="1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6" Apr 22 19:50:14.792409 ip-10-0-137-19 
kubenswrapper[2579]: I0422 19:50:14.792352 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6"} err="failed to get container status \"1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6\": rpc error: code = NotFound desc = could not find container \"1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6\": container with ID starting with 1cd664b60d08a865b7e97e65f757991465e0adc68b3c8cc03e2c46b89f2dbfa6 not found: ID does not exist" Apr 22 19:50:14.792409 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.792369 2579 scope.go:117] "RemoveContainer" containerID="5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02" Apr 22 19:50:14.792590 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:50:14.792576 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02\": container with ID starting with 5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02 not found: ID does not exist" containerID="5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02" Apr 22 19:50:14.792637 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.792593 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02"} err="failed to get container status \"5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02\": rpc error: code = NotFound desc = could not find container \"5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02\": container with ID starting with 5b796991d59ab4b6433408c9ff8a7968c9c5494a9ee021ffdee759eb4a1bea02 not found: ID does not exist" Apr 22 19:50:14.838278 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.838240 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh"] Apr 22 19:50:14.843365 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:14.843342 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-7d65b5b7cd-fw9vh"] Apr 22 19:50:15.780419 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:15.780383 2579 generic.go:358] "Generic (PLEG): container finished" podID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerID="2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453" exitCode=0 Apr 22 19:50:15.780776 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:15.780469 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerDied","Data":"2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453"} Apr 22 19:50:16.544919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:16.544889 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" path="/var/lib/kubelet/pods/6ba0c826-c8ba-43e6-89d3-d6ec227de0b6/volumes" Apr 22 19:50:16.784839 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:16.784805 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerStarted","Data":"2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd"} Apr 22 19:50:16.785188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:16.784843 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerStarted","Data":"4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc"} Apr 22 19:50:16.785188 
ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:16.784997 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:16.813183 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:16.813100 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podStartSLOduration=2.813088762 podStartE2EDuration="2.813088762s" podCreationTimestamp="2026-04-22 19:50:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:50:16.811469154 +0000 UTC m=+3828.811043052" watchObservedRunningTime="2026-04-22 19:50:16.813088762 +0000 UTC m=+3828.812662658" Apr 22 19:50:17.787834 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:17.787804 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:17.789024 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:17.788997 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:50:18.790237 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:18.790200 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:50:23.794665 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:23.794636 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:50:23.795185 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:23.795154 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:50:33.796116 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:33.796076 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:50:43.795704 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:43.795662 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:50:53.795141 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:50:53.795095 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:51:03.795793 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:03.795752 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:51:13.796084 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:13.796046 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 22 19:51:23.796093 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:23.796059 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" Apr 22 19:51:24.303467 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:24.303418 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"] Apr 22 19:51:24.303860 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:24.303819 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kube-rbac-proxy" containerID="cri-o://2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd" gracePeriod=30 Apr 22 19:51:24.304005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:24.303798 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container" containerID="cri-o://4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc" gracePeriod=30 Apr 22 19:51:24.972682 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:24.972651 2579 generic.go:358] "Generic (PLEG): container finished" podID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" 
containerID="2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd" exitCode=2 Apr 22 19:51:24.973066 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:24.972716 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerDied","Data":"2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd"} Apr 22 19:51:25.397490 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.397407 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"] Apr 22 19:51:25.397686 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.397674 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerName="storage-initializer" Apr 22 19:51:25.397732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.397687 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerName="storage-initializer" Apr 22 19:51:25.397732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.397704 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerName="storage-initializer" Apr 22 19:51:25.397732 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.397709 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerName="storage-initializer" Apr 22 19:51:25.397831 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.397754 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerName="storage-initializer" Apr 22 19:51:25.397831 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.397764 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="6ba0c826-c8ba-43e6-89d3-d6ec227de0b6" containerName="storage-initializer" Apr 22 19:51:25.400746 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.400729 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" Apr 22 19:51:25.403375 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.403354 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\"" Apr 22 19:51:25.403780 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.403762 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert\"" Apr 22 19:51:25.411685 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.411663 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"] Apr 22 19:51:25.426727 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.426704 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb0db2-55bd-40cd-a885-dba58c4df757-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" Apr 22 19:51:25.426839 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.426764 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl2bv\" (UniqueName: \"kubernetes.io/projected/0fbb0db2-55bd-40cd-a885-dba58c4df757-kube-api-access-fl2bv\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" Apr 22 19:51:25.426839 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.426819 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" Apr 22 19:51:25.426919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.426846 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbb0db2-55bd-40cd-a885-dba58c4df757-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" Apr 22 19:51:25.528059 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.528018 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl2bv\" (UniqueName: \"kubernetes.io/projected/0fbb0db2-55bd-40cd-a885-dba58c4df757-kube-api-access-fl2bv\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" Apr 22 19:51:25.528309 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.528086 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: 
\"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:25.528309 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.528115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbb0db2-55bd-40cd-a885-dba58c4df757-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:25.528309 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.528146 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb0db2-55bd-40cd-a885-dba58c4df757-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:25.528309 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:25.528239 2579 secret.go:189] Couldn't get secret kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert: secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 22 19:51:25.528568 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:25.528340 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls podName:0fbb0db2-55bd-40cd-a885-dba58c4df757 nodeName:}" failed. No retries permitted until 2026-04-22 19:51:26.02832134 +0000 UTC m=+3898.027895215 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls") pod "isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" (UID: "0fbb0db2-55bd-40cd-a885-dba58c4df757") : secret "isvc-sklearn-s3-tls-serving-fail-predictor-serving-cert" not found
Apr 22 19:51:25.528634 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.528611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb0db2-55bd-40cd-a885-dba58c4df757-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:25.528848 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.528829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbb0db2-55bd-40cd-a885-dba58c4df757-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:25.538039 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:25.538016 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl2bv\" (UniqueName: \"kubernetes.io/projected/0fbb0db2-55bd-40cd-a885-dba58c4df757-kube-api-access-fl2bv\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:26.032893 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:26.032858 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:26.035311 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:26.035286 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:26.311047 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:26.310947 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:26.437310 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:26.437285 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"]
Apr 22 19:51:26.439427 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:51:26.439395 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fbb0db2_55bd_40cd_a885_dba58c4df757.slice/crio-0900b07e7549545c219a758ce09aab328ca18f50fd8c6555a9d96e41ec45f14b WatchSource:0}: Error finding container 0900b07e7549545c219a758ce09aab328ca18f50fd8c6555a9d96e41ec45f14b: Status 404 returned error can't find the container with id 0900b07e7549545c219a758ce09aab328ca18f50fd8c6555a9d96e41ec45f14b
Apr 22 19:51:26.979339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:26.979300 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" event={"ID":"0fbb0db2-55bd-40cd-a885-dba58c4df757","Type":"ContainerStarted","Data":"bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc"}
Apr 22 19:51:26.979339 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:26.979338 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" event={"ID":"0fbb0db2-55bd-40cd-a885-dba58c4df757","Type":"ContainerStarted","Data":"0900b07e7549545c219a758ce09aab328ca18f50fd8c6555a9d96e41ec45f14b"}
Apr 22 19:51:28.790538 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.790502 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kube-rbac-proxy" probeResult="failure" output="Get \"https://10.132.0.66:8643/healthz\": dial tcp 10.132.0.66:8643: connect: connection refused"
Apr 22 19:51:28.955708 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.955683 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"
Apr 22 19:51:28.986692 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.986655 2579 generic.go:358] "Generic (PLEG): container finished" podID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerID="4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc" exitCode=0
Apr 22 19:51:28.986845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.986739 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"
Apr 22 19:51:28.986845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.986747 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerDied","Data":"4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc"}
Apr 22 19:51:28.986845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.986798 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n" event={"ID":"e8bf4f6c-405a-4028-81a2-3fc61d071bc1","Type":"ContainerDied","Data":"ac8337ef83fe3b9c3628ff90dcf5416561c36ff0adcd0d9c2de48de294e32abe"}
Apr 22 19:51:28.986845 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.986819 2579 scope.go:117] "RemoveContainer" containerID="2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd"
Apr 22 19:51:28.994820 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:28.994798 2579 scope.go:117] "RemoveContainer" containerID="4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc"
Apr 22 19:51:29.001939 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.001919 2579 scope.go:117] "RemoveContainer" containerID="2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453"
Apr 22 19:51:29.008681 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.008666 2579 scope.go:117] "RemoveContainer" containerID="2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd"
Apr 22 19:51:29.008920 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:29.008899 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd\": container with ID starting with 2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd not found: ID does not exist" containerID="2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd"
Apr 22 19:51:29.008991 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.008933 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd"} err="failed to get container status \"2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd\": rpc error: code = NotFound desc = could not find container \"2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd\": container with ID starting with 2a83c8b9cfd4b026d5e29eadf5030c96c2b25c5a016e3f5d39b32b45ad265ecd not found: ID does not exist"
Apr 22 19:51:29.008991 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.008959 2579 scope.go:117] "RemoveContainer" containerID="4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc"
Apr 22 19:51:29.009189 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:29.009173 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc\": container with ID starting with 4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc not found: ID does not exist" containerID="4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc"
Apr 22 19:51:29.009233 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.009196 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc"} err="failed to get container status \"4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc\": rpc error: code = NotFound desc = could not find container \"4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc\": container with ID starting with 4811ebe9cd92694c48cf7abaceb1c422fbd25d98a360b928deaceaae2b72e8fc not found: ID does not exist"
Apr 22 19:51:29.009233 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.009211 2579 scope.go:117] "RemoveContainer" containerID="2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453"
Apr 22 19:51:29.009460 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:29.009434 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453\": container with ID starting with 2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453 not found: ID does not exist" containerID="2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453"
Apr 22 19:51:29.009523 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.009459 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453"} err="failed to get container status \"2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453\": rpc error: code = NotFound desc = could not find container \"2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453\": container with ID starting with 2c290926e05533946d2d15a1caadb5d0ff9e28beb2b091e08537e16ac52ae453 not found: ID does not exist"
Apr 22 19:51:29.054849 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.054810 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kv2sb\" (UniqueName: \"kubernetes.io/projected/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kube-api-access-kv2sb\") pod \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") "
Apr 22 19:51:29.055086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.054873 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-proxy-tls\") pod \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") "
Apr 22 19:51:29.055086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.054921 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") pod \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") "
Apr 22 19:51:29.055086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.054947 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-cabundle-cert\") pod \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") "
Apr 22 19:51:29.055086 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.054996 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kserve-provision-location\") pod \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\" (UID: \"e8bf4f6c-405a-4028-81a2-3fc61d071bc1\") "
Apr 22 19:51:29.055380 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.055349 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config") pod "e8bf4f6c-405a-4028-81a2-3fc61d071bc1" (UID: "e8bf4f6c-405a-4028-81a2-3fc61d071bc1"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:51:29.055380 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.055369 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e8bf4f6c-405a-4028-81a2-3fc61d071bc1" (UID: "e8bf4f6c-405a-4028-81a2-3fc61d071bc1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:51:29.055489 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.055383 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "e8bf4f6c-405a-4028-81a2-3fc61d071bc1" (UID: "e8bf4f6c-405a-4028-81a2-3fc61d071bc1"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:51:29.057156 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.057131 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "e8bf4f6c-405a-4028-81a2-3fc61d071bc1" (UID: "e8bf4f6c-405a-4028-81a2-3fc61d071bc1"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:51:29.057374 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.057135 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kube-api-access-kv2sb" (OuterVolumeSpecName: "kube-api-access-kv2sb") pod "e8bf4f6c-405a-4028-81a2-3fc61d071bc1" (UID: "e8bf4f6c-405a-4028-81a2-3fc61d071bc1"). InnerVolumeSpecName "kube-api-access-kv2sb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:51:29.156323 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.156253 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-isvc-sklearn-s3-tls-serving-pass-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:29.156323 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.156316 2579 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-cabundle-cert\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:29.156323 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.156328 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:29.156578 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.156339 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kv2sb\" (UniqueName: \"kubernetes.io/projected/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-kube-api-access-kv2sb\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:29.156578 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.156348 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e8bf4f6c-405a-4028-81a2-3fc61d071bc1-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:29.314525 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.314489 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"]
Apr 22 19:51:29.318505 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:29.318479 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-c86b5bbcf-h5v4n"]
Apr 22 19:51:30.542905 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:30.542864 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" path="/var/lib/kubelet/pods/e8bf4f6c-405a-4028-81a2-3fc61d071bc1/volumes"
Apr 22 19:51:30.995541 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:30.995514 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_0fbb0db2-55bd-40cd-a885-dba58c4df757/storage-initializer/0.log"
Apr 22 19:51:30.995720 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:30.995551 2579 generic.go:358] "Generic (PLEG): container finished" podID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerID="bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc" exitCode=1
Apr 22 19:51:30.995720 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:30.995579 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" event={"ID":"0fbb0db2-55bd-40cd-a885-dba58c4df757","Type":"ContainerDied","Data":"bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc"}
Apr 22 19:51:31.999448 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:31.999419 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_0fbb0db2-55bd-40cd-a885-dba58c4df757/storage-initializer/0.log"
Apr 22 19:51:31.999820 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:31.999521 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" event={"ID":"0fbb0db2-55bd-40cd-a885-dba58c4df757","Type":"ContainerStarted","Data":"8bcaa985821822879a4ac294ac2b41269bc04a9e7889e919489517c1f589c2dd"}
Apr 22 19:51:34.006808 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:34.006776 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_0fbb0db2-55bd-40cd-a885-dba58c4df757/storage-initializer/1.log"
Apr 22 19:51:34.007252 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:34.007187 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_0fbb0db2-55bd-40cd-a885-dba58c4df757/storage-initializer/0.log"
Apr 22 19:51:34.007252 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:34.007219 2579 generic.go:358] "Generic (PLEG): container finished" podID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerID="8bcaa985821822879a4ac294ac2b41269bc04a9e7889e919489517c1f589c2dd" exitCode=1
Apr 22 19:51:34.007364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:34.007302 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" event={"ID":"0fbb0db2-55bd-40cd-a885-dba58c4df757","Type":"ContainerDied","Data":"8bcaa985821822879a4ac294ac2b41269bc04a9e7889e919489517c1f589c2dd"}
Apr 22 19:51:34.007364 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:34.007341 2579 scope.go:117] "RemoveContainer" containerID="bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc"
Apr 22 19:51:34.007718 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:34.007698 2579 scope.go:117] "RemoveContainer" containerID="bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc"
Apr 22 19:51:34.020116 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:34.020082 2579 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_kserve-ci-e2e-test_0fbb0db2-55bd-40cd-a885-dba58c4df757_0 in pod sandbox 0900b07e7549545c219a758ce09aab328ca18f50fd8c6555a9d96e41ec45f14b from index: no such id: 'bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc'" containerID="bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc"
Apr 22 19:51:34.020199 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:34.020139 2579 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_kserve-ci-e2e-test_0fbb0db2-55bd-40cd-a885-dba58c4df757_0 in pod sandbox 0900b07e7549545c219a758ce09aab328ca18f50fd8c6555a9d96e41ec45f14b from index: no such id: 'bc927189a2cf6b7eac2812f01cb40fd42f0fc748933a0ac1fe9a2e0e4b4134fc'; Skipping pod \"isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_kserve-ci-e2e-test(0fbb0db2-55bd-40cd-a885-dba58c4df757)\"" logger="UnhandledError"
Apr 22 19:51:34.021534 ip-10-0-137-19 kubenswrapper[2579]: E0422 19:51:34.021512 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_kserve-ci-e2e-test(0fbb0db2-55bd-40cd-a885-dba58c4df757)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757"
Apr 22 19:51:35.010926 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.010897 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_0fbb0db2-55bd-40cd-a885-dba58c4df757/storage-initializer/1.log"
Apr 22 19:51:35.382921 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.382825 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"]
Apr 22 19:51:35.506372 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.506346 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_0fbb0db2-55bd-40cd-a885-dba58c4df757/storage-initializer/1.log"
Apr 22 19:51:35.506511 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.506413 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:35.608021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.607987 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb0db2-55bd-40cd-a885-dba58c4df757-kserve-provision-location\") pod \"0fbb0db2-55bd-40cd-a885-dba58c4df757\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") "
Apr 22 19:51:35.608021 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.608033 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbb0db2-55bd-40cd-a885-dba58c4df757-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") pod \"0fbb0db2-55bd-40cd-a885-dba58c4df757\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") "
Apr 22 19:51:35.608343 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.608064 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls\") pod \"0fbb0db2-55bd-40cd-a885-dba58c4df757\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") "
Apr 22 19:51:35.608343 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.608105 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl2bv\" (UniqueName: \"kubernetes.io/projected/0fbb0db2-55bd-40cd-a885-dba58c4df757-kube-api-access-fl2bv\") pod \"0fbb0db2-55bd-40cd-a885-dba58c4df757\" (UID: \"0fbb0db2-55bd-40cd-a885-dba58c4df757\") "
Apr 22 19:51:35.608482 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.608343 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fbb0db2-55bd-40cd-a885-dba58c4df757-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0fbb0db2-55bd-40cd-a885-dba58c4df757" (UID: "0fbb0db2-55bd-40cd-a885-dba58c4df757"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 19:51:35.608482 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.608443 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fbb0db2-55bd-40cd-a885-dba58c4df757-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config" (OuterVolumeSpecName: "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config") pod "0fbb0db2-55bd-40cd-a885-dba58c4df757" (UID: "0fbb0db2-55bd-40cd-a885-dba58c4df757"). InnerVolumeSpecName "isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 19:51:35.610183 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.610162 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0fbb0db2-55bd-40cd-a885-dba58c4df757" (UID: "0fbb0db2-55bd-40cd-a885-dba58c4df757"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 19:51:35.610325 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.610305 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fbb0db2-55bd-40cd-a885-dba58c4df757-kube-api-access-fl2bv" (OuterVolumeSpecName: "kube-api-access-fl2bv") pod "0fbb0db2-55bd-40cd-a885-dba58c4df757" (UID: "0fbb0db2-55bd-40cd-a885-dba58c4df757"). InnerVolumeSpecName "kube-api-access-fl2bv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 19:51:35.709139 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.709107 2579 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb0db2-55bd-40cd-a885-dba58c4df757-kserve-provision-location\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:35.709139 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.709135 2579 reconciler_common.go:299] "Volume detached for volume \"isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\" (UniqueName: \"kubernetes.io/configmap/0fbb0db2-55bd-40cd-a885-dba58c4df757-isvc-sklearn-s3-tls-serving-fail-kube-rbac-proxy-sar-config\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:35.709427 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.709145 2579 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0fbb0db2-55bd-40cd-a885-dba58c4df757-proxy-tls\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:35.709427 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:35.709159 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fl2bv\" (UniqueName: \"kubernetes.io/projected/0fbb0db2-55bd-40cd-a885-dba58c4df757-kube-api-access-fl2bv\") on node \"ip-10-0-137-19.ec2.internal\" DevicePath \"\""
Apr 22 19:51:36.014723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:36.014643 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9_0fbb0db2-55bd-40cd-a885-dba58c4df757/storage-initializer/1.log"
Apr 22 19:51:36.014723 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:36.014707 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9" event={"ID":"0fbb0db2-55bd-40cd-a885-dba58c4df757","Type":"ContainerDied","Data":"0900b07e7549545c219a758ce09aab328ca18f50fd8c6555a9d96e41ec45f14b"}
Apr 22 19:51:36.015167 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:36.014737 2579 scope.go:117] "RemoveContainer" containerID="8bcaa985821822879a4ac294ac2b41269bc04a9e7889e919489517c1f589c2dd"
Apr 22 19:51:36.015167 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:36.014759 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"
Apr 22 19:51:36.057435 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:36.057398 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"]
Apr 22 19:51:36.063634 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:36.063599 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-5bc5655965-hzgx9"]
Apr 22 19:51:36.544015 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:51:36.543971 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757" path="/var/lib/kubelet/pods/0fbb0db2-55bd-40cd-a885-dba58c4df757/volumes"
Apr 22 19:52:06.156588 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156548 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8tvdg/must-gather-wlgs2"]
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156806 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerName="storage-initializer"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156817 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerName="storage-initializer"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156829 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156835 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156843 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kube-rbac-proxy"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156849 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kube-rbac-proxy"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156861 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="storage-initializer"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156868 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="storage-initializer"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156905 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kserve-container"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156915 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerName="storage-initializer"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156921 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8bf4f6c-405a-4028-81a2-3fc61d071bc1" containerName="kube-rbac-proxy"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156928 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerName="storage-initializer"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156972 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerName="storage-initializer"
Apr 22 19:52:06.157088 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.156978 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbb0db2-55bd-40cd-a885-dba58c4df757" containerName="storage-initializer"
Apr 22 19:52:06.159802 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.159784 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tvdg/must-gather-wlgs2"
Apr 22 19:52:06.162815 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.162783 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8tvdg\"/\"kube-root-ca.crt\""
Apr 22 19:52:06.162958 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.162813 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8tvdg\"/\"default-dockercfg-ksxp9\""
Apr 22 19:52:06.162958 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.162926 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8tvdg\"/\"openshift-service-ca.crt\""
Apr 22 19:52:06.168030 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.168007 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/must-gather-wlgs2"]
Apr 22 19:52:06.236969 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.236907 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf5kv\" (UniqueName: \"kubernetes.io/projected/7b005642-936c-4d56-8085-3d0a756d0fe4-kube-api-access-lf5kv\") pod \"must-gather-wlgs2\" (UID: \"7b005642-936c-4d56-8085-3d0a756d0fe4\") " pod="openshift-must-gather-8tvdg/must-gather-wlgs2"
Apr 22 19:52:06.236969 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.236970 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7b005642-936c-4d56-8085-3d0a756d0fe4-must-gather-output\") pod \"must-gather-wlgs2\" (UID: \"7b005642-936c-4d56-8085-3d0a756d0fe4\") " pod="openshift-must-gather-8tvdg/must-gather-wlgs2"
Apr 22 19:52:06.338240 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.338197 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf5kv\" (UniqueName: \"kubernetes.io/projected/7b005642-936c-4d56-8085-3d0a756d0fe4-kube-api-access-lf5kv\") pod \"must-gather-wlgs2\" (UID: \"7b005642-936c-4d56-8085-3d0a756d0fe4\") " pod="openshift-must-gather-8tvdg/must-gather-wlgs2"
Apr 22 19:52:06.338240 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.338238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7b005642-936c-4d56-8085-3d0a756d0fe4-must-gather-output\") pod \"must-gather-wlgs2\" (UID: \"7b005642-936c-4d56-8085-3d0a756d0fe4\") " pod="openshift-must-gather-8tvdg/must-gather-wlgs2"
Apr 22 19:52:06.338593 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.338578 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7b005642-936c-4d56-8085-3d0a756d0fe4-must-gather-output\") pod \"must-gather-wlgs2\" (UID: \"7b005642-936c-4d56-8085-3d0a756d0fe4\") " pod="openshift-must-gather-8tvdg/must-gather-wlgs2"
Apr 22 19:52:06.346939 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.346900 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf5kv\" (UniqueName: \"kubernetes.io/projected/7b005642-936c-4d56-8085-3d0a756d0fe4-kube-api-access-lf5kv\") pod \"must-gather-wlgs2\" (UID: \"7b005642-936c-4d56-8085-3d0a756d0fe4\") " pod="openshift-must-gather-8tvdg/must-gather-wlgs2"
Apr 22 19:52:06.470406 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.470364 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tvdg/must-gather-wlgs2" Apr 22 19:52:06.593348 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:06.593190 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/must-gather-wlgs2"] Apr 22 19:52:06.595953 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:52:06.595906 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b005642_936c_4d56_8085_3d0a756d0fe4.slice/crio-6f2c915a78c6662ff1bfe75a5f2a75ed49d65a2ff39f7ca0aa871afb4538f956 WatchSource:0}: Error finding container 6f2c915a78c6662ff1bfe75a5f2a75ed49d65a2ff39f7ca0aa871afb4538f956: Status 404 returned error can't find the container with id 6f2c915a78c6662ff1bfe75a5f2a75ed49d65a2ff39f7ca0aa871afb4538f956 Apr 22 19:52:07.098666 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:07.098631 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/must-gather-wlgs2" event={"ID":"7b005642-936c-4d56-8085-3d0a756d0fe4","Type":"ContainerStarted","Data":"6f2c915a78c6662ff1bfe75a5f2a75ed49d65a2ff39f7ca0aa871afb4538f956"} Apr 22 19:52:08.103239 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:08.103205 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/must-gather-wlgs2" event={"ID":"7b005642-936c-4d56-8085-3d0a756d0fe4","Type":"ContainerStarted","Data":"99167853a8f77756452e93e9b99de1c463df3d8fe7b85b73a6a5bd90b8970467"} Apr 22 19:52:08.103239 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:08.103241 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/must-gather-wlgs2" event={"ID":"7b005642-936c-4d56-8085-3d0a756d0fe4","Type":"ContainerStarted","Data":"a00f1ac355b472c3a5c1336a5f2006df551d648a3efbc0f91b263ddc215c09d7"} Apr 22 19:52:08.122546 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:08.122499 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-8tvdg/must-gather-wlgs2" podStartSLOduration=1.260469857 podStartE2EDuration="2.122482629s" podCreationTimestamp="2026-04-22 19:52:06 +0000 UTC" firstStartedPulling="2026-04-22 19:52:06.597668135 +0000 UTC m=+3938.597242011" lastFinishedPulling="2026-04-22 19:52:07.459680894 +0000 UTC m=+3939.459254783" observedRunningTime="2026-04-22 19:52:08.120675062 +0000 UTC m=+3940.120248958" watchObservedRunningTime="2026-04-22 19:52:08.122482629 +0000 UTC m=+3940.122056580" Apr 22 19:52:09.219485 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:09.219451 2579 ???:1] "http2: server: error reading preface from client 10.0.137.19:33596: read tcp 10.0.137.19:10250->10.0.137.19:33596: read: connection reset by peer" Apr 22 19:52:09.227878 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:09.227854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hr4mw_1e505d90-2a43-4f9a-a513-c9f1e8c46ac4/global-pull-secret-syncer/0.log" Apr 22 19:52:09.410751 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:09.410719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xwvlb_6fad9739-db9a-49b1-aece-1696145ac1fb/konnectivity-agent/0.log" Apr 22 19:52:09.494341 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:09.494238 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-19.ec2.internal_75deb02a1840e8f4bbd2a0c4ef3a9ce4/haproxy/0.log" Apr 22 19:52:13.100692 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:13.100588 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7m87w_52eb41d6-6bcb-4547-bcb0-bb79ad417873/node-exporter/0.log" Apr 22 19:52:13.127175 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:13.127113 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7m87w_52eb41d6-6bcb-4547-bcb0-bb79ad417873/kube-rbac-proxy/0.log" Apr 22 
19:52:13.157316 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:13.157289 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7m87w_52eb41d6-6bcb-4547-bcb0-bb79ad417873/init-textfile/0.log" Apr 22 19:52:15.006707 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:15.006679 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-ckmv9_477c8ebb-278f-4a30-9476-d0758c0fce10/networking-console-plugin/0.log" Apr 22 19:52:16.275700 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.275602 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd"] Apr 22 19:52:16.280563 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.280538 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.289005 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.288932 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd"] Apr 22 19:52:16.423072 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.423031 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-proc\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.423408 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.423386 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-podres\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " 
pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.423618 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.423571 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-lib-modules\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.423714 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.423684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vjtw\" (UniqueName: \"kubernetes.io/projected/3990e45e-c1c1-4b5d-aace-a95399f90ff0-kube-api-access-5vjtw\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.423714 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.423708 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-sys\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524462 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524425 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-proc\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524462 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524470 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" 
(UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-podres\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524489 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-lib-modules\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524519 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5vjtw\" (UniqueName: \"kubernetes.io/projected/3990e45e-c1c1-4b5d-aace-a95399f90ff0-kube-api-access-5vjtw\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524540 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-sys\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524553 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-proc\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524702 ip-10-0-137-19 kubenswrapper[2579]: I0422 
19:52:16.524598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-sys\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524620 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-podres\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.524702 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.524620 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3990e45e-c1c1-4b5d-aace-a95399f90ff0-lib-modules\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.537609 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.537542 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vjtw\" (UniqueName: \"kubernetes.io/projected/3990e45e-c1c1-4b5d-aace-a95399f90ff0-kube-api-access-5vjtw\") pod \"perf-node-gather-daemonset-g8vtd\" (UID: \"3990e45e-c1c1-4b5d-aace-a95399f90ff0\") " pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.593996 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.593952 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:16.736583 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:16.736554 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd"] Apr 22 19:52:16.741435 ip-10-0-137-19 kubenswrapper[2579]: W0422 19:52:16.741378 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3990e45e_c1c1_4b5d_aace_a95399f90ff0.slice/crio-b4e6067ff565f9050e7af1219836332d21a55a0f7d42524427614507e80fbf38 WatchSource:0}: Error finding container b4e6067ff565f9050e7af1219836332d21a55a0f7d42524427614507e80fbf38: Status 404 returned error can't find the container with id b4e6067ff565f9050e7af1219836332d21a55a0f7d42524427614507e80fbf38 Apr 22 19:52:17.118517 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.118445 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-22c27_efb38099-2266-40a5-ba8f-a7759b82543b/dns/0.log" Apr 22 19:52:17.137493 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.137459 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" event={"ID":"3990e45e-c1c1-4b5d-aace-a95399f90ff0","Type":"ContainerStarted","Data":"b428ff676e23ab32f3cd3acbe7b063021d2693eb2210dbfd14aa8d6cde4c88ac"} Apr 22 19:52:17.137493 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.137498 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" event={"ID":"3990e45e-c1c1-4b5d-aace-a95399f90ff0","Type":"ContainerStarted","Data":"b4e6067ff565f9050e7af1219836332d21a55a0f7d42524427614507e80fbf38"} Apr 22 19:52:17.137703 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.137601 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 
19:52:17.150223 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.150196 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-22c27_efb38099-2266-40a5-ba8f-a7759b82543b/kube-rbac-proxy/0.log" Apr 22 19:52:17.156667 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.156609 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" podStartSLOduration=1.156592664 podStartE2EDuration="1.156592664s" podCreationTimestamp="2026-04-22 19:52:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 19:52:17.156313777 +0000 UTC m=+3949.155887669" watchObservedRunningTime="2026-04-22 19:52:17.156592664 +0000 UTC m=+3949.156166563" Apr 22 19:52:17.291034 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.290996 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-8vvlg_fdf18fe1-67fc-415f-b637-f1d3a5343441/dns-node-resolver/0.log" Apr 22 19:52:17.900984 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:17.900955 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-sdfvz_b1af8ee4-e661-4ab0-ac3d-a1dbdbbd3797/node-ca/0.log" Apr 22 19:52:19.059798 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:19.059772 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9wfwv_85daa685-3b8e-4641-b717-08df86db79f9/serve-healthcheck-canary/0.log" Apr 22 19:52:19.741103 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:19.741079 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j7lrq_fb42db1a-d501-4d76-be24-a264eb8f5075/kube-rbac-proxy/0.log" Apr 22 19:52:19.768396 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:19.768372 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-runtime-extractor-j7lrq_fb42db1a-d501-4d76-be24-a264eb8f5075/exporter/0.log" Apr 22 19:52:19.795575 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:19.795549 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-j7lrq_fb42db1a-d501-4d76-be24-a264eb8f5075/extractor/0.log" Apr 22 19:52:21.906853 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:21.906816 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-hlcqf_707021df-9a48-4588-a53e-c8ed64b47ad6/server/0.log" Apr 22 19:52:22.169919 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:22.169841 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-gcm2s_b913ff7b-2b42-4862-b0ed-77e64ba21f2d/manager/0.log" Apr 22 19:52:22.213307 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:22.213280 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-q8c7l_a97eb4f8-b9a7-4a20-9294-cfd77103ca1e/s3-init/0.log" Apr 22 19:52:22.302997 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:22.302964 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-2jpp7_acd885e0-7d33-41d6-adaf-66fb859e10b2/seaweedfs/0.log" Apr 22 19:52:22.331471 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:22.331444 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-mlw7c_2a9e4317-9c35-43f1-9440-fdd65d6c362d/seaweedfs-tls-custom/0.log" Apr 22 19:52:22.359023 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:22.358986 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-w8qpf_86e3e93e-024b-4c69-a5bb-0091df5e9a51/seaweedfs-tls-serving/0.log" Apr 22 19:52:23.151033 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:23.151003 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-must-gather-8tvdg/perf-node-gather-daemonset-g8vtd" Apr 22 19:52:28.425302 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.425213 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k6ptp_68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b/kube-multus-additional-cni-plugins/0.log" Apr 22 19:52:28.451158 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.451127 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k6ptp_68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b/egress-router-binary-copy/0.log" Apr 22 19:52:28.477397 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.477375 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k6ptp_68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b/cni-plugins/0.log" Apr 22 19:52:28.503188 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.503157 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k6ptp_68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b/bond-cni-plugin/0.log" Apr 22 19:52:28.529415 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.529386 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k6ptp_68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b/routeoverride-cni/0.log" Apr 22 19:52:28.554646 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.554589 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k6ptp_68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b/whereabouts-cni-bincopy/0.log" Apr 22 19:52:28.581660 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.581630 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-k6ptp_68a8ebb3-0e0f-4205-a22a-a1e895bb8f0b/whereabouts-cni/0.log" Apr 22 19:52:28.652192 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.652146 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-z6889_6ebbe8a2-e463-407a-a400-add3d4b5438a/kube-multus/0.log" Apr 22 19:52:28.749553 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.749512 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n2rv2_4b04d910-b761-4095-a135-7026105ff82f/network-metrics-daemon/0.log" Apr 22 19:52:28.772689 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:28.772661 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-n2rv2_4b04d910-b761-4095-a135-7026105ff82f/kube-rbac-proxy/0.log" Apr 22 19:52:30.026083 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.026047 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/ovn-controller/0.log" Apr 22 19:52:30.071642 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.071610 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/ovn-acl-logging/0.log" Apr 22 19:52:30.104799 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.104771 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/kube-rbac-proxy-node/0.log" Apr 22 19:52:30.129665 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.129637 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 19:52:30.153031 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.153006 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/northd/0.log" Apr 22 19:52:30.181589 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.181566 2579 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/nbdb/0.log" Apr 22 19:52:30.207937 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.207910 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/sbdb/0.log" Apr 22 19:52:30.346031 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:30.345932 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j89vb_adf86e5d-38b3-4766-b4e8-e9f7a2380707/ovnkube-controller/0.log" Apr 22 19:52:31.713582 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:31.713551 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-jnbvq_d340dfa0-a9e2-48b1-ad81-8921d5782b2e/network-check-target-container/0.log" Apr 22 19:52:32.653786 ip-10-0-137-19 kubenswrapper[2579]: I0422 19:52:32.653757 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-mbb5v_6e6f31ad-2c4e-458c-a3d2-d0367bb85bc8/iptables-alerter/0.log"