Apr 16 20:09:05.678150 ip-10-0-138-118 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 20:09:05.678160 ip-10-0-138-118 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 20:09:05.678167 ip-10-0-138-118 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 20:09:05.678328 ip-10-0-138-118 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 20:09:15.800141 ip-10-0-138-118 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 20:09:15.800158 ip-10-0-138-118 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 560fa0b72b994f95864e1b0968800205 --
Apr 16 20:11:42.589815 ip-10-0-138-118 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 20:11:43.040386 ip-10-0-138-118 kubenswrapper[2563]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:43.040386 ip-10-0-138-118 kubenswrapper[2563]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 20:11:43.040386 ip-10-0-138-118 kubenswrapper[2563]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 20:11:43.040386 ip-10-0-138-118 kubenswrapper[2563]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 20:11:43.040386 ip-10-0-138-118 kubenswrapper[2563]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
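The five deprecation warnings above all point at the same migration: these flags belong in the KubeletConfiguration file passed via --config (/etc/kubernetes/kubelet.conf per the FLAG dump further down). A minimal sketch of what that could look like, assuming the kubelet.config.k8s.io/v1beta1 fields and the flag values visible in this log; this is illustrative, not this node's actual managed config (on OpenShift the Machine Config Operator owns these files, so a change like this would normally go through a KubeletConfig custom resource rather than a hand edit):

apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# replaces --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
systemReserved:
  cpu: 500m
  ephemeral-storage: 1Gi
  memory: 1Gi
# --minimum-container-ttl-duration has no config-file equivalent; per the
# warning, express the policy through evictionHard / evictionSoft instead.
# --pod-infra-container-image also has no KubeletConfiguration field; the
# sandbox (pause) image is configured in the container runtime (CRI-O).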
Apr 16 20:11:43.043306 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.043223 2563 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 20:11:43.045608 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045594 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.045608 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045609 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045613 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045616 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045619 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045622 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045626 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045628 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045632 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045635 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045637 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045640 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045643 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045645 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045648 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045651 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045653 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045656 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045658 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045661 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045664 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.045672 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045666 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045669 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045672 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045674 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045677 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045680 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045683 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045686 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045689 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045691 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045694 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045696 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045699 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045702 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045704 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045707 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045711 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045715 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045717 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.046141 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045720 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045723 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045725 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045728 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045730 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045733 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045735 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045738 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045740 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045743 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045746 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045749 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045752 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045754 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045757 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045760 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045763 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045767 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045771 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.046639 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045774 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045777 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045780 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045783 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045786 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045789 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045792 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045796 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045799 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045801 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045804 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045806 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045809 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045812 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045814 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045817 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045820 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045823 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045825 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045828 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.047084 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045830 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045833 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045836 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045838 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045841 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045843 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.045846 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046220 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046225 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046228 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046231 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046234 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046236 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046240 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046243 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046247 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046251 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046254 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046257 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046260 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.047553 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046263 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046266 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046268 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046271 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046273 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046276 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046278 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046281 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046284 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046287 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046289 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046292 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046294 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046297 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046299 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046302 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046304 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046307 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046310 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.048038 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046312 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046316 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046319 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046321 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046324 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046326 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046329 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046331 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046334 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046336 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046339 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046341 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046344 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046346 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046348 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046351 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046354 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046356 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046359 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046361 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.048543 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046363 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046367 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046369 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046373 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046377 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046380 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046383 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046386 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046388 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046391 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046394 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046396 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046399 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046402 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046404 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046407 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046410 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046412 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046415 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046417 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.049047 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046421 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046423 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046426 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046428 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046431 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046433 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046436 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046438 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046440 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046443 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046446 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046448 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046451 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.046453 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047012 2563 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047021 2563 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047027 2563 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047031 2563 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047036 2563 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047039 2563 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047043 2563 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 20:11:43.049523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047048 2563 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047051 2563 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047054 2563 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047058 2563 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047062 2563 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047065 2563 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047068 2563 flags.go:64] FLAG: --cgroup-root=""
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047071 2563 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047074 2563 flags.go:64] FLAG: --client-ca-file=""
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047077 2563 flags.go:64] FLAG: --cloud-config=""
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047080 2563 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047083 2563 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047088 2563 flags.go:64] FLAG: --cluster-domain=""
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047090 2563 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047094 2563 flags.go:64] FLAG: --config-dir=""
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047096 2563 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047100 2563 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047104 2563 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047107 2563 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047111 2563 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047114 2563 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047117 2563 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047120 2563 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047123 2563 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047126 2563 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 20:11:43.050027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047129 2563 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047134 2563 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047138 2563 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047140 2563 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047143 2563 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047146 2563 flags.go:64] FLAG: --enable-server="true"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047149 2563 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047153 2563 flags.go:64] FLAG: --event-burst="100"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047156 2563 flags.go:64] FLAG: --event-qps="50"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047159 2563 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047162 2563 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047166 2563 flags.go:64] FLAG: --eviction-hard=""
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047170 2563 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047173 2563 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047176 2563 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047179 2563 flags.go:64] FLAG: --eviction-soft=""
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047182 2563 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047185 2563 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047188 2563 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047191 2563 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047194 2563 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047197 2563 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047200 2563 flags.go:64] FLAG: --feature-gates=""
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047204 2563 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047207 2563 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 20:11:43.050641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047210 2563 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047213 2563 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047219 2563 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047222 2563 flags.go:64] FLAG: --help="false"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047225 2563 flags.go:64] FLAG: --hostname-override="ip-10-0-138-118.ec2.internal"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047228 2563 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047231 2563 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047234 2563 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047237 2563 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047241 2563 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047244 2563 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047247 2563 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047250 2563 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047253 2563 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047256 2563 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047259 2563 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047262 2563 flags.go:64] FLAG: --kube-reserved=""
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047265 2563 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047268 2563 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047271 2563 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047274 2563 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047277 2563 flags.go:64] FLAG: --lock-file=""
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047280 2563 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047283 2563 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 20:11:43.051217 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047286 2563 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047291 2563 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047294 2563 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047296 2563 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047299 2563 flags.go:64] FLAG: --logging-format="text"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047302 2563 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047305 2563 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047308 2563 flags.go:64] FLAG: --manifest-url=""
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047311 2563 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047315 2563 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047319 2563 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047324 2563 flags.go:64] FLAG: --max-pods="110"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047327 2563 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047330 2563 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047333 2563 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047336 2563 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047339 2563 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047342 2563 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047345 2563 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047353 2563 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047356 2563 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047359 2563 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047362 2563 flags.go:64] FLAG: --pod-cidr=""
Apr 16 20:11:43.051820 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047365 2563 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047370 2563 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047373 2563 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047377 2563 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047380 2563 flags.go:64] FLAG: --port="10250"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047383 2563 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047386 2563 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e651f3d63a526a37"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047389 2563 flags.go:64] FLAG: --qos-reserved=""
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047392 2563 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047397 2563 flags.go:64] FLAG: --register-node="true"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047400 2563 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047402 2563 flags.go:64] FLAG: --register-with-taints=""
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047406 2563 flags.go:64] FLAG: --registry-burst="10"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047409 2563 flags.go:64] FLAG: --registry-qps="5"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047412 2563 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047415 2563 flags.go:64] FLAG: --reserved-memory=""
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047418 2563 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047421 2563 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047424 2563 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047427 2563 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047431 2563 flags.go:64] FLAG: --runonce="false"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047434 2563 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047437 2563 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047440 2563 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047443 2563 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047446 2563 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 20:11:43.052371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047449 2563 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047452 2563 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047462 2563 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047465 2563 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047468 2563 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047471 2563 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047474 2563 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047476 2563 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047480 2563 flags.go:64] FLAG: --system-cgroups=""
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047482 2563 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047488 2563 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047491 2563 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047494 2563 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047500 2563 flags.go:64] FLAG: --tls-min-version=""
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047503 2563 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047507 2563 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047510 2563 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047513 2563 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047516 2563 flags.go:64] FLAG: --v="2"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047520 2563 flags.go:64] FLAG: --version="false"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047524 2563 flags.go:64] FLAG: --vmodule=""
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047528 2563 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.047531 2563 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047668 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.052991 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047673 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047676 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047682 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047686 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047689 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047692 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047695 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047697 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047700 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047702 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047705 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047708 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047711 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047713 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047716 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047718 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047721 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047723 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047726 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.053658 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047729 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047731 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047734 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047737 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047744 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047746 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047749 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047752 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047754 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047757 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047760 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047762 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047765 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047785 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047788 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047792 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047795 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047798 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047801 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047806 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.054122 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047809 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047811 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047814 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047817 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047819 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047822 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047825 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047827 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047830 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047832 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047835 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047837 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047840 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047843 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047845 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047848 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047852 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047855 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047857 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.054631 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047860 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047862 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047865 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047868 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047870 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047873 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047875 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047878 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047882 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047885 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047888 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047890 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047893 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047895 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047898 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047900 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047903 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047905 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047908 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047911 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.055277 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047913 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047916 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047919 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047922 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047925 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047929 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.047932 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.048535 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.055950 2563 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 20:11:43.056043 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.055982 2563 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 20:11:43.056327 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056212 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.056327 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056231 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.056327 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056238 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.056327 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056243 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.056327 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056257 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.056327 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056264 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056361 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056365 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056368 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056371 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056375 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056377 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056381 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056384 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056387 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056390 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056393 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056395 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056398 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056400 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056403 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056406 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056408 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056412 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056414 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.056476 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056417 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056419 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056422 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056425 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056431 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056434 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056436 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056439 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056442 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056444 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056447 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056450 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056452 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056455 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056459 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056461 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056464 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056469 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056473 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.057016 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056476 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056479 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056483 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056486 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056489 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056492 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056495 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056497 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056500 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056503 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056505 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056508 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056511 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056516 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056519 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056521 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056524 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056527 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056529 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056532 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.057503 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056534 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056537 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056539 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056542 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056545 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056547 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056551 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056568 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056571 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056575 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056577 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056580 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056584 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056586 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056589 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056592 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056594 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056597 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056599 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.057999 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056602 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056605 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056607 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.056613 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056706 2563 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056711 2563 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056714 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056717 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056720 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056723 2563 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056726 2563 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056729 2563 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056731 2563 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056733 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056736 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056739 2563 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 20:11:43.058451 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056742 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056744 2563 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056747 2563 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056749 2563 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056753 2563 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056757 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056760 2563 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056763 2563 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056765 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056768 2563 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056771 2563 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056773 2563 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056776 2563 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056779 2563 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056781 2563 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056784 2563 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056787 2563 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056789 2563 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056791 2563 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 20:11:43.058847 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056794 2563 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056797 2563 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056799 2563 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056802 2563 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056805 2563 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056807 2563 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056810 2563 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056812 2563 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056815 2563 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056817 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056820 2563 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056823 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056825 2563 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056828 2563 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056830 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056833 2563 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056835 2563 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056838 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056841 2563 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056844 2563 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 20:11:43.059312 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056846 2563 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056849 2563 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056852 2563 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056854 2563 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056857 2563 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056860 2563 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056862 2563 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056865 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056867 2563 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056870 2563 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056872 2563 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056874 2563 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056877 2563 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056879 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056882 2563 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056884 2563 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056886 2563 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056889 2563 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056892 2563 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056894 2563 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 20:11:43.059836 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056897 2563 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056899 2563 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056902 2563 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056904 2563 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056907 2563 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056909 2563 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056912 2563 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056914 2563 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056918 2563 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056922 2563 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056925 2563 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056929 2563 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056931 2563 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056934 2563 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:43.056937 2563 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.056941 2563 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 20:11:43.060325 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.057605 2563 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 20:11:43.060853 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.059801 2563 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 20:11:43.060853 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.060602 2563 server.go:1019] "Starting client certificate rotation"
Apr 16 20:11:43.060853 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.060697 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:43.060853 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.060735 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 20:11:43.084751 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.084732 2563 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:43.087232 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.087213 2563 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 20:11:43.098612 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.098594 2563 log.go:25] "Validated CRI v1 runtime API"
Apr 16 20:11:43.103894 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.103879 2563 log.go:25] "Validated CRI v1 image API"
Apr 16 20:11:43.105152 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.105139 2563 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 20:11:43.111744 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.111723 2563 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 b53cb772-f9af-4326-aebc-551bfabe5f1c:/dev/nvme0n1p4 cf5c757e-51c5-4abc-bd43-d12c3dc9e9e3:/dev/nvme0n1p3]
Apr 16 20:11:43.111826 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.111742 2563 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 20:11:43.115956 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.115788 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 20:11:43.117658 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.117532 2563 manager.go:217] Machine: {Timestamp:2026-04-16 20:11:43.115785111 +0000 UTC m=+0.404710889 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3106447 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fa090d3183e667c93293a0ba45d16 SystemUUID:ec2fa090-d318-3e66-7c93-293a0ba45d16 BootID:560fa0b7-2b99-4f95-864e-1b0968800205 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:67:2f:2e:37:7b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:67:2f:2e:37:7b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:d6:d2:cc:bb:2a:9d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 20:11:43.117658 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.117650 2563 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 20:11:43.117784 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.117743 2563 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 20:11:43.119367 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.119346 2563 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 20:11:43.119495 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.119370 2563 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-118.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 20:11:43.119538 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.119506 2563 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 20:11:43.119538 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.119514 2563 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 20:11:43.119538 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.119527 2563 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:43.120222 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.120213 2563 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 20:11:43.120960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.120950 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:43.121066 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.121057 2563 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 20:11:43.123372 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.123363 2563 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 20:11:43.123408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.123376 2563 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 20:11:43.123408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.123387 2563 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 20:11:43.123408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.123396 2563 kubelet.go:397] "Adding apiserver pod source"
Apr 16 20:11:43.123408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.123404 2563 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 20:11:43.124368 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.124356 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:43.124416 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.124375 2563 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 20:11:43.128072 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.128057 2563 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 20:11:43.129323 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.129307 2563 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 20:11:43.131051 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131038 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131057 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131066 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131075 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131083 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131093 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131101 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131109 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131119 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 20:11:43.131127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131128 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 20:11:43.131390 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131143 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 20:11:43.131390 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.131157 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 20:11:43.132010 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.132000 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 20:11:43.132067 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.132013 2563 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 20:11:43.135290 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.135275 2563 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 20:11:43.135364 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.135315 2563 server.go:1295] "Started kubelet"
Apr 16 20:11:43.135428 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.135402 2563 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 20:11:43.135489 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.135420 2563 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 20:11:43.135529 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.135511 2563 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 20:11:43.136029 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.136009 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 20:11:43.136253 ip-10-0-138-118 systemd[1]: Started Kubernetes Kubelet.
Apr 16 20:11:43.136370 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.136295 2563 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-118.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 20:11:43.136370 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.136303 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 20:11:43.136537 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.136522 2563 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 20:11:43.136836 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.136823 2563 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 20:11:43.142103 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.140956 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-118.ec2.internal.18a6ef610bee8c34 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-118.ec2.internal,UID:ip-10-0-138-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-118.ec2.internal,},FirstTimestamp:2026-04-16 20:11:43.135288372 +0000 UTC m=+0.424214149,LastTimestamp:2026-04-16 20:11:43.135288372 +0000 UTC m=+0.424214149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-118.ec2.internal,}"
Apr 16 20:11:43.142600 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.142581 2563 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 20:11:43.143097 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.143080 2563 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 20:11:43.144024 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.144002 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found"
Apr 16 20:11:43.144251 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144232 2563 factory.go:55] Registering systemd factory
Apr 16 20:11:43.144332 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144262 2563 factory.go:223] Registration of the systemd container factory successfully
Apr 16 20:11:43.144332 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144236 2563 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 20:11:43.144332 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144311 2563 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 20:11:43.144479 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.144329 2563 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 20:11:43.144479 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144380 2563 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 20:11:43.144479 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144388 2563 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 20:11:43.144625 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144497 2563 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 20:11:43.144625 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144587 2563 factory.go:153] Registering CRI-O factory
Apr 16 20:11:43.144625 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144598 2563 factory.go:223] Registration of the crio container factory successfully
Apr 16 20:11:43.144726 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144648 2563 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 20:11:43.144726 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144665 2563 factory.go:103] Registering Raw factory
Apr 16 20:11:43.144726 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.144674 2563 manager.go:1196] Started watching for new ooms in manager
Apr 16 20:11:43.145172 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.145157 2563 manager.go:319] Starting recovery of all containers
Apr 16 20:11:43.152812 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.152770 2563 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 20:11:43.152812 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.152788 2563 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-118.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 20:11:43.156027 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.155873 2563 manager.go:324] Recovery completed
Apr 16 20:11:43.160272 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.160260 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 20:11:43.163252 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.163235 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-59znn"
Apr 16 20:11:43.163315 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.163276 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 20:11:43.163315 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.163308 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 20:11:43.163371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.163320 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientPID"
Apr 16 20:11:43.163892 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.163876 2563 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 20:11:43.163892 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.163890 2563 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 20:11:43.163995 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.163906 2563 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 20:11:43.166466 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.166454 2563 policy_none.go:49] "None policy: Start"
Apr 16 20:11:43.166523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.166470 2563 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 20:11:43.166523 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.166479 2563 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 20:11:43.171705 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.171684 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-59znn"
Apr 16 20:11:43.174052 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.173989 2563 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-118.ec2.internal.18a6ef610d99d5cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-118.ec2.internal,UID:ip-10-0-138-118.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-118.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-118.ec2.internal,},FirstTimestamp:2026-04-16 20:11:43.163291085 +0000 UTC m=+0.452216869,LastTimestamp:2026-04-16 20:11:43.163291085 +0000 UTC m=+0.452216869,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-118.ec2.internal,}"
Apr 16 20:11:43.209090 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.209078 2563 manager.go:341] "Starting Device Plugin manager"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.209113 2563 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.209125 2563 server.go:85] "Starting device plugin registration server"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.209358 2563 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.209369 2563 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.209431 2563 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.209484 2563 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.209492 2563 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.209945 2563 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 20:11:43.214632 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.209973 2563 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-118.ec2.internal\" not found"
Apr 16 20:11:43.292855 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.292808 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 20:11:43.293947 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.293933 2563 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 20:11:43.294002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.293957 2563 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 20:11:43.294002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.293974 2563 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 20:11:43.294002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.293979 2563 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 20:11:43.294125 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.294008 2563 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 20:11:43.296916 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.296896 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:43.310189 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.310165 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:43.312759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.312737 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:43.312828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.312772 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:43.312828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.312781 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:43.312828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.312801 2563 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.323678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.323663 2563 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.323720 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.323681 2563 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-118.ec2.internal\": node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:43.338076 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.338060 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:43.394347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.394330 2563 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal"] Apr 16 20:11:43.394392 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.394382 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:43.396427 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.396408 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:43.396531 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.396440 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:43.396531 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.396457 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:43.398056 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398041 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:43.398212 ip-10-0-138-118 
kubenswrapper[2563]: I0416 20:11:43.398197 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.398254 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398230 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:43.398720 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398702 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:43.398809 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398732 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:43.398809 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398742 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:43.398809 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398706 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:43.398809 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398800 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:43.398809 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.398811 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:43.399808 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.399795 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.399849 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.399821 2563 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 20:11:43.400427 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.400407 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientMemory" Apr 16 20:11:43.400519 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.400438 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 20:11:43.400519 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.400452 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeHasSufficientPID" Apr 16 20:11:43.424054 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.424037 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-118.ec2.internal\" not found" node="ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.428290 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.428276 2563 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-118.ec2.internal\" not found" node="ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.438764 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.438749 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:43.446146 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.446125 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/00a800bd34a7cee6861a5791d3f97be3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-118.ec2.internal\" (UID: \"00a800bd34a7cee6861a5791d3f97be3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.446220 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.446153 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c169960e93078427d6a6be239cc022de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal\" (UID: \"c169960e93078427d6a6be239cc022de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.446220 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.446179 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c169960e93078427d6a6be239cc022de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal\" (UID: \"c169960e93078427d6a6be239cc022de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.539612 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.539597 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:43.547012 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.546965 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/00a800bd34a7cee6861a5791d3f97be3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-118.ec2.internal\" (UID: \"00a800bd34a7cee6861a5791d3f97be3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.547012 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.546989 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c169960e93078427d6a6be239cc022de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal\" (UID: \"c169960e93078427d6a6be239cc022de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.547012 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.547004 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c169960e93078427d6a6be239cc022de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal\" (UID: \"c169960e93078427d6a6be239cc022de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.547151 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.547028 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c169960e93078427d6a6be239cc022de-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal\" (UID: \"c169960e93078427d6a6be239cc022de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.547151 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.547045 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/00a800bd34a7cee6861a5791d3f97be3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-118.ec2.internal\" (UID: \"00a800bd34a7cee6861a5791d3f97be3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.547151 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.547056 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c169960e93078427d6a6be239cc022de-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal\" (UID: \"c169960e93078427d6a6be239cc022de\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.640386 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.640351 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:43.725929 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.725906 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.730396 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:43.730383 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:43.741075 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.741055 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:43.841678 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.841624 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:43.942160 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:43.942139 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:44.042700 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:44.042679 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:44.061145 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.061126 2563 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 20:11:44.061253 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.061238 2563 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 20:11:44.143548 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:44.143497 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:44.143548 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.143515 2563 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 20:11:44.168109 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.168087 2563 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 20:11:44.174362 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.174330 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 20:06:43 +0000 UTC" deadline="2027-12-08 15:59:29.741371676 +0000 UTC" Apr 16 20:11:44.174362 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.174362 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14419h47m45.567013737s" Apr 16 20:11:44.187984 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.187968 2563 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-w4k84" Apr 16 20:11:44.190149 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.189313 2563 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:44.198169 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.198153 2563 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-w4k84" Apr 16 20:11:44.243814 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:44.243790 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not 
found" Apr 16 20:11:44.256706 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:44.256683 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc169960e93078427d6a6be239cc022de.slice/crio-98a3f505b99c516990d0fa1d8ddc0d591ebe4f9ee0851856e5bb7209f8528021 WatchSource:0}: Error finding container 98a3f505b99c516990d0fa1d8ddc0d591ebe4f9ee0851856e5bb7209f8528021: Status 404 returned error can't find the container with id 98a3f505b99c516990d0fa1d8ddc0d591ebe4f9ee0851856e5bb7209f8528021 Apr 16 20:11:44.257286 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:44.257266 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00a800bd34a7cee6861a5791d3f97be3.slice/crio-e2ef819bd7e4ed8258054611ee319acf36d21e8c09a94730f9310c86a3ea47ca WatchSource:0}: Error finding container e2ef819bd7e4ed8258054611ee319acf36d21e8c09a94730f9310c86a3ea47ca: Status 404 returned error can't find the container with id e2ef819bd7e4ed8258054611ee319acf36d21e8c09a94730f9310c86a3ea47ca Apr 16 20:11:44.261427 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.261411 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:11:44.296357 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.296306 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" event={"ID":"00a800bd34a7cee6861a5791d3f97be3","Type":"ContainerStarted","Data":"e2ef819bd7e4ed8258054611ee319acf36d21e8c09a94730f9310c86a3ea47ca"} Apr 16 20:11:44.297315 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.297293 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" event={"ID":"c169960e93078427d6a6be239cc022de","Type":"ContainerStarted","Data":"98a3f505b99c516990d0fa1d8ddc0d591ebe4f9ee0851856e5bb7209f8528021"} Apr 16 20:11:44.305369 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.305355 2563 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:44.344515 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:44.344497 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:44.444927 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:44.444876 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:44.545347 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:44.545318 2563 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-118.ec2.internal\" not found" Apr 16 20:11:44.561266 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.561247 2563 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:44.643544 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.643521 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" Apr 16 20:11:44.654030 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.654010 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:44.654988 
ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.654957 2563 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" Apr 16 20:11:44.666366 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.666337 2563 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 20:11:44.951920 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:44.951890 2563 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 20:11:45.125259 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.125230 2563 apiserver.go:52] "Watching apiserver" Apr 16 20:11:45.133393 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.133367 2563 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 20:11:45.135194 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.135171 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7","openshift-cluster-node-tuning-operator/tuned-x9vx5","openshift-dns/node-resolver-bh4x8","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal","openshift-multus/multus-tnt6p","openshift-ovn-kubernetes/ovnkube-node-qjb9s","kube-system/konnectivity-agent-tgsc9","openshift-image-registry/node-ca-btw62","openshift-multus/multus-additional-cni-plugins-krhbv","openshift-multus/network-metrics-daemon-mx2qh","openshift-network-diagnostics/network-check-target-ssfhx","openshift-network-operator/iptables-alerter-6mc4g"] Apr 16 20:11:45.137204 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.137184 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.138285 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.138265 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.139344 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.139287 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.140521 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.140486 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.141021 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141003 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.141125 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141110 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 20:11:45.141294 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141279 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-rn2s6\"" Apr 16 20:11:45.141364 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141314 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.141667 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141424 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.141667 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141488 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.141667 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141543 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 20:11:45.141667 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141632 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 20:11:45.141911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141672 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 20:11:45.141911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141682 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 20:11:45.141911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.141870 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-qhfrq\"" Apr 16 20:11:45.142067 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.142050 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.142764 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.142734 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-zmnkz\"" Apr 16 20:11:45.142863 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.142745 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.142863 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.142822 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.142863 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.142822 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.143762 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.143740 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.143933 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.143740 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-v9s6x\"" Apr 16 20:11:45.144415 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.144293 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.145856 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.145656 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 20:11:45.145856 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.145730 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.146051 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.146032 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.146100 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.146066 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-c8fnz\"" Apr 16 20:11:45.146408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.146392 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 20:11:45.146994 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.146804 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 20:11:45.147641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.147603 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-99qj6\"" Apr 16 20:11:45.147827 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.147803 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 20:11:45.148352 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.148321 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.148462 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.148446 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.149777 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.149757 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:45.149865 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.149826 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:11:45.150702 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.150686 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-mlz62\"" Apr 16 20:11:45.151011 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.150997 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.151107 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.151089 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:45.151198 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.151146 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:11:45.151271 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.151197 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.151271 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.151245 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 20:11:45.151271 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.151264 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-zj42d\"" Apr 16 20:11:45.151446 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.151273 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 20:11:45.151518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.151489 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 20:11:45.152681 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.152663 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.154938 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.154922 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 20:11:45.155133 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.155115 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 20:11:45.155133 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.155129 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 20:11:45.155401 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.155386 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-4j5tt\"" Apr 16 20:11:45.156669 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156651 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-run-netns\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156679 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-etc-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156695 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-node-log\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156717 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovnkube-script-lib\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156742 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-modprobe-d\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156765 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/172b2c56-3bf0-4eef-aab2-4934181bce38-host\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156795 
2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156821 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-var-lib-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156845 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-env-overrides\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156866 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k775\" (UniqueName: \"kubernetes.io/projected/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-kube-api-access-4k775\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156886 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-systemd-units\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156907 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-log-socket\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156931 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.156961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156956 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-socket-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.156999 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-device-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157031 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-etc-selinux\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157055 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-var-lib-kubelet\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157079 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-k8s-cni-cncf-io\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157119 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-tmp\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157153 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-cni-netd\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157180 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wv26\" (UniqueName: \"kubernetes.io/projected/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-kube-api-access-5wv26\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157207 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysconfig\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157237 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-cnibin\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157280 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-netns\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157308 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157305 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-cni-bin\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157329 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-kubelet\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157351 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-systemd\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157383 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cnibin\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157422 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-system-cni-dir\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157447 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-system-cni-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157469 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-cni-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157759 ip-10-0-138-118 
kubenswrapper[2563]: I0416 20:11:45.157490 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d5dcfae-693b-4c79-8475-17131d139947-konnectivity-ca\") pod \"konnectivity-agent-tgsc9\" (UID: \"2d5dcfae-693b-4c79-8475-17131d139947\") " pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157543 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysctl-d\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157602 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovn-node-metrics-cert\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157627 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-systemd\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157649 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-host\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157691 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7efb583-4245-4d53-b571-eaf057bac81b-cni-binary-copy\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157723 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpnzp\" (UniqueName: \"kubernetes.io/projected/e7efb583-4245-4d53-b571-eaf057bac81b-kube-api-access-gpnzp\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.157759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157747 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d5dcfae-693b-4c79-8475-17131d139947-agent-certs\") pod \"konnectivity-agent-tgsc9\" (UID: \"2d5dcfae-693b-4c79-8475-17131d139947\") " pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157770 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/172b2c56-3bf0-4eef-aab2-4934181bce38-serviceca\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157795 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48db7bb0-6c87-484f-b5df-58ae1720d8f9-hosts-file\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157843 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-cni-multus\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157884 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-slash\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157908 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-os-release\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157934 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2mzk\" (UniqueName: \"kubernetes.io/projected/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-kube-api-access-m2mzk\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157965 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-sys\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.157996 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-conf-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158017 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e7efb583-4245-4d53-b571-eaf057bac81b-multus-daemon-config\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" 
Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158039 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-etc-kubernetes\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158073 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-kubelet\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158088 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158102 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovnkube-config\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158115 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k27gk\" (UniqueName: \"kubernetes.io/projected/172b2c56-3bf0-4eef-aab2-4934181bce38-kube-api-access-k27gk\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158153 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-sys-fs\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158200 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-kubernetes\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.158245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158237 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysctl-conf\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158275 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-run\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158296 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-lib-modules\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158326 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158368 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsfxf\" (UniqueName: \"kubernetes.io/projected/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-kube-api-access-lsfxf\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158393 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48db7bb0-6c87-484f-b5df-58ae1720d8f9-tmp-dir\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158419 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-hostroot\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158441 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-multus-certs\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158455 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-run-ovn-kubernetes\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158476 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krhbv\" (UID: 
\"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158499 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158522 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158544 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-registration-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158582 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-ovn\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158604 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-cni-bin\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158639 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th5hh\" (UniqueName: \"kubernetes.io/projected/48db7bb0-6c87-484f-b5df-58ae1720d8f9-kube-api-access-th5hh\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.158907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158700 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-socket-dir-parent\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.159525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158753 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-tuned\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 
20:11:45.159525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.158779 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-os-release\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.199593 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.199571 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:44 +0000 UTC" deadline="2027-11-12 09:07:57.08311938 +0000 UTC" Apr 16 20:11:45.199593 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.199592 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13788h56m11.883529687s" Apr 16 20:11:45.245255 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.245198 2563 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 20:11:45.259060 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259030 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-run-netns\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259157 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259066 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-etc-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259157 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259088 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-node-log\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259157 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259103 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovnkube-script-lib\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259157 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259118 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-modprobe-d\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.259157 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259139 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/172b2c56-3bf0-4eef-aab2-4934181bce38-host\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.259157 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259142 
2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-run-netns\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259161 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-etc-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259163 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259218 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bl7lm\" (UniqueName: \"kubernetes.io/projected/0bd61e48-9d65-473a-b8b7-da6980e29685-kube-api-access-bl7lm\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259165 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-node-log\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259249 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-var-lib-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259287 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-var-lib-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259287 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-env-overrides\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259345 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/172b2c56-3bf0-4eef-aab2-4934181bce38-host\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " 
pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259373 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4k775\" (UniqueName: \"kubernetes.io/projected/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-kube-api-access-4k775\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.259417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259409 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-systemd-units\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259429 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-log-socket\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259463 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-modprobe-d\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259470 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-log-socket\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259505 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-systemd-units\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259586 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259625 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-socket-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259645 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259652 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-device-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259696 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-etc-selinux\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259718 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-var-lib-kubelet\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259719 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-env-overrides\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259718 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259751 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-k8s-cni-cncf-io\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259754 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-socket-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259794 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-device-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259810 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-k8s-cni-cncf-io\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.259848 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259829 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovnkube-script-lib\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259851 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-tmp\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259834 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-var-lib-kubelet\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259864 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-etc-selinux\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259906 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-cni-netd\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259942 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-cni-netd\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.259979 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5wv26\" (UniqueName: \"kubernetes.io/projected/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-kube-api-access-5wv26\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260005 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysconfig\") pod 
\"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260030 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-cnibin\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260051 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-netns\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260082 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-cni-bin\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260109 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-kubelet\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260115 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysconfig\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260133 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-systemd\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260127 2563 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260161 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cnibin\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260173 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-cni-bin\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260187 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-kubelet\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.260606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260190 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzpkw\" (UniqueName: \"kubernetes.io/projected/51782696-d22a-4882-9ad3-4de29c66583c-kube-api-access-fzpkw\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260218 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-netns\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260150 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-cnibin\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260230 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260263 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-system-cni-dir\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260268 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" 
(UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-systemd\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260288 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-system-cni-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260265 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cnibin\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260311 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-cni-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260346 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d5dcfae-693b-4c79-8475-17131d139947-konnectivity-ca\") pod \"konnectivity-agent-tgsc9\" (UID: \"2d5dcfae-693b-4c79-8475-17131d139947\") " pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260362 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-cni-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260375 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysctl-d\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260380 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-system-cni-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260395 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-system-cni-dir\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260406 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260435 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0bd61e48-9d65-473a-b8b7-da6980e29685-iptables-alerter-script\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260464 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovn-node-metrics-cert\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.261420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260477 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysctl-d\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260489 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-systemd\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260514 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-host\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260538 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7efb583-4245-4d53-b571-eaf057bac81b-cni-binary-copy\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260576 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gpnzp\" (UniqueName: \"kubernetes.io/projected/e7efb583-4245-4d53-b571-eaf057bac81b-kube-api-access-gpnzp\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260601 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d5dcfae-693b-4c79-8475-17131d139947-agent-certs\") pod \"konnectivity-agent-tgsc9\" (UID: \"2d5dcfae-693b-4c79-8475-17131d139947\") " pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260624 2563 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/172b2c56-3bf0-4eef-aab2-4934181bce38-serviceca\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260647 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48db7bb0-6c87-484f-b5df-58ae1720d8f9-hosts-file\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260672 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-cni-multus\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260696 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-slash\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260700 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-systemd\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260722 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-os-release\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260748 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m2mzk\" (UniqueName: \"kubernetes.io/projected/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-kube-api-access-m2mzk\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260750 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-host\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260800 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/48db7bb0-6c87-484f-b5df-58ae1720d8f9-hosts-file\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 
20:11:45.260847 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-sys\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260866 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/2d5dcfae-693b-4c79-8475-17131d139947-konnectivity-ca\") pod \"konnectivity-agent-tgsc9\" (UID: \"2d5dcfae-693b-4c79-8475-17131d139947\") " pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260872 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-conf-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.262206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260918 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-conf-dir\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260929 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e7efb583-4245-4d53-b571-eaf057bac81b-multus-daemon-config\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260955 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-etc-kubernetes\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.260989 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-kubelet\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261012 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261041 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovnkube-config\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261065 2563 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k27gk\" (UniqueName: \"kubernetes.io/projected/172b2c56-3bf0-4eef-aab2-4934181bce38-kube-api-access-k27gk\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261087 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-sys-fs\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261124 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-kubernetes\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261154 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysctl-conf\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261190 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-run\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261213 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-lib-modules\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261236 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261259 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsfxf\" (UniqueName: \"kubernetes.io/projected/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-kube-api-access-lsfxf\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261283 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48db7bb0-6c87-484f-b5df-58ae1720d8f9-tmp-dir\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " 
pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261291 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e7efb583-4245-4d53-b571-eaf057bac81b-cni-binary-copy\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261306 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-hostroot\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261333 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-multus-certs\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261350 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-var-lib-cni-multus\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261359 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bd61e48-9d65-473a-b8b7-da6980e29685-host-slash\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261398 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-slash\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261428 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-run-ovn-kubernetes\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261454 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261512 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261527 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/172b2c56-3bf0-4eef-aab2-4934181bce38-serviceca\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261539 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261586 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-registration-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261604 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-sys\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261612 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-ovn\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261638 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-cni-bin\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261651 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-etc-kubernetes\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261664 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-th5hh\" (UniqueName: \"kubernetes.io/projected/48db7bb0-6c87-484f-b5df-58ae1720d8f9-kube-api-access-th5hh\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261690 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-kubelet\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261690 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-socket-dir-parent\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261722 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-tuned\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.263876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261739 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-os-release\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261740 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-multus-socket-dir-parent\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.261949 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-openvswitch\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262205 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-run-ovn-kubernetes\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262203 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262329 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-run-ovn\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262420 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovnkube-config\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262424 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e7efb583-4245-4d53-b571-eaf057bac81b-multus-daemon-config\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262471 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-run\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262510 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-kubernetes\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262544 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-hostroot\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262607 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-sys-fs\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262690 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-os-release\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262734 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-os-release\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262745 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262771 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/e7efb583-4245-4d53-b571-eaf057bac81b-host-run-multus-certs\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262792 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-registration-dir\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262806 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-host-cni-bin\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.264707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262865 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262884 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-sysctl-conf\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262885 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/48db7bb0-6c87-484f-b5df-58ae1720d8f9-tmp-dir\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262908 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.262960 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-lib-modules\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.263669 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-ovn-node-metrics-cert\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.264028 2563 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/2d5dcfae-693b-4c79-8475-17131d139947-agent-certs\") pod \"konnectivity-agent-tgsc9\" (UID: \"2d5dcfae-693b-4c79-8475-17131d139947\") " pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.264081 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-tmp\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.265347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.265054 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-etc-tuned\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.271441 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.271388 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k775\" (UniqueName: \"kubernetes.io/projected/cd72324a-d342-4d7b-8fde-e0e8a56bbe39-kube-api-access-4k775\") pod \"tuned-x9vx5\" (UID: \"cd72324a-d342-4d7b-8fde-e0e8a56bbe39\") " pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.272407 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.272313 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wv26\" (UniqueName: \"kubernetes.io/projected/3d468fb0-6c11-4fba-b1e4-ef75ae52d254-kube-api-access-5wv26\") pod \"ovnkube-node-qjb9s\" (UID: \"3d468fb0-6c11-4fba-b1e4-ef75ae52d254\") " pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.272785 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.272766 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k27gk\" (UniqueName: \"kubernetes.io/projected/172b2c56-3bf0-4eef-aab2-4934181bce38-kube-api-access-k27gk\") pod \"node-ca-btw62\" (UID: \"172b2c56-3bf0-4eef-aab2-4934181bce38\") " pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.273309 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.273263 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2mzk\" (UniqueName: \"kubernetes.io/projected/02dbfbd4-16bb-4990-8e97-87e6ff7a47f1-kube-api-access-m2mzk\") pod \"multus-additional-cni-plugins-krhbv\" (UID: \"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1\") " pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.273446 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.273427 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpnzp\" (UniqueName: \"kubernetes.io/projected/e7efb583-4245-4d53-b571-eaf057bac81b-kube-api-access-gpnzp\") pod \"multus-tnt6p\" (UID: \"e7efb583-4245-4d53-b571-eaf057bac81b\") " pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.273610 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.273592 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-th5hh\" (UniqueName: \"kubernetes.io/projected/48db7bb0-6c87-484f-b5df-58ae1720d8f9-kube-api-access-th5hh\") pod \"node-resolver-bh4x8\" (UID: \"48db7bb0-6c87-484f-b5df-58ae1720d8f9\") " pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.273672 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.273612 
2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsfxf\" (UniqueName: \"kubernetes.io/projected/c31a15f0-4c2b-4236-86e5-e92fbcf467e5-kube-api-access-lsfxf\") pod \"aws-ebs-csi-driver-node-54qm7\" (UID: \"c31a15f0-4c2b-4236-86e5-e92fbcf467e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.362461 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.362429 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:45.362461 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.362457 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0bd61e48-9d65-473a-b8b7-da6980e29685-iptables-alerter-script\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.362678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.362483 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bd61e48-9d65-473a-b8b7-da6980e29685-host-slash\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.362678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.362536 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0bd61e48-9d65-473a-b8b7-da6980e29685-host-slash\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.362678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.362584 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bl7lm\" (UniqueName: \"kubernetes.io/projected/0bd61e48-9d65-473a-b8b7-da6980e29685-kube-api-access-bl7lm\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.362678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.362625 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fzpkw\" (UniqueName: \"kubernetes.io/projected/51782696-d22a-4882-9ad3-4de29c66583c-kube-api-access-fzpkw\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:45.362863 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.362698 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.362863 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.362778 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:11:45.862744218 +0000 UTC m=+3.151669984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.362863 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.362805 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:45.363031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.363012 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/0bd61e48-9d65-473a-b8b7-da6980e29685-iptables-alerter-script\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.370265 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.370244 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:45.370265 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.370266 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:45.370410 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.370277 2563 projected.go:194] Error preparing data for projected volume kube-api-access-5h7nf for pod openshift-network-diagnostics/network-check-target-ssfhx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:45.370410 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.370329 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf podName:34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:45.870316751 +0000 UTC m=+3.159242519 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5h7nf" (UniqueName: "kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf") pod "network-check-target-ssfhx" (UID: "34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:45.372179 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.372152 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzpkw\" (UniqueName: \"kubernetes.io/projected/51782696-d22a-4882-9ad3-4de29c66583c-kube-api-access-fzpkw\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:45.373139 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.373123 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bl7lm\" (UniqueName: \"kubernetes.io/projected/0bd61e48-9d65-473a-b8b7-da6980e29685-kube-api-access-bl7lm\") pod \"iptables-alerter-6mc4g\" (UID: \"0bd61e48-9d65-473a-b8b7-da6980e29685\") " pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.450021 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.449995 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:11:45.456777 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.456755 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" Apr 16 20:11:45.464307 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.464286 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" Apr 16 20:11:45.468818 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.468797 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-bh4x8" Apr 16 20:11:45.476469 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.476451 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-tnt6p" Apr 16 20:11:45.482971 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.482952 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:45.488482 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.488455 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-btw62" Apr 16 20:11:45.494950 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.494930 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-krhbv" Apr 16 20:11:45.501472 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.501450 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-6mc4g" Apr 16 20:11:45.851469 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.851266 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7efb583_4245_4d53_b571_eaf057bac81b.slice/crio-cea20e0aa963097f724107bfce5ea83c2a32498b2b15f997a927526ab4c13372 WatchSource:0}: Error finding container cea20e0aa963097f724107bfce5ea83c2a32498b2b15f997a927526ab4c13372: Status 404 returned error can't find the container with id cea20e0aa963097f724107bfce5ea83c2a32498b2b15f997a927526ab4c13372 Apr 16 20:11:45.853069 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.853001 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bd61e48_9d65_473a_b8b7_da6980e29685.slice/crio-7d00d776aa98824ab20f83327d2141efc3c3fe6de3113e5829fcbf5febeef980 WatchSource:0}: Error finding container 7d00d776aa98824ab20f83327d2141efc3c3fe6de3113e5829fcbf5febeef980: Status 404 returned error can't find the container with id 7d00d776aa98824ab20f83327d2141efc3c3fe6de3113e5829fcbf5febeef980 Apr 16 20:11:45.856193 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.856169 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd72324a_d342_4d7b_8fde_e0e8a56bbe39.slice/crio-99fe415da7ef152ecfdc220a581e171321d13fe084f6f56c20a45ef9220cb36c WatchSource:0}: Error finding container 99fe415da7ef152ecfdc220a581e171321d13fe084f6f56c20a45ef9220cb36c: Status 404 returned error can't find the container with id 99fe415da7ef152ecfdc220a581e171321d13fe084f6f56c20a45ef9220cb36c Apr 16 20:11:45.857077 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.857046 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d468fb0_6c11_4fba_b1e4_ef75ae52d254.slice/crio-25008c2700ce70bda22d5d466c0055d6bd1cff9505ff5fb6975d0e7248417424 WatchSource:0}: Error finding container 25008c2700ce70bda22d5d466c0055d6bd1cff9505ff5fb6975d0e7248417424: Status 404 returned error can't find the container with id 25008c2700ce70bda22d5d466c0055d6bd1cff9505ff5fb6975d0e7248417424 Apr 16 20:11:45.858024 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.857917 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48db7bb0_6c87_484f_b5df_58ae1720d8f9.slice/crio-7d87f0b33b100447de985d67bc0c934a91824ddf63c03e9efb476b2b213cdb14 WatchSource:0}: Error finding container 7d87f0b33b100447de985d67bc0c934a91824ddf63c03e9efb476b2b213cdb14: Status 404 returned error can't find the container with id 7d87f0b33b100447de985d67bc0c934a91824ddf63c03e9efb476b2b213cdb14 Apr 16 20:11:45.858913 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.858819 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod172b2c56_3bf0_4eef_aab2_4934181bce38.slice/crio-4de32190ce97a61078c64aa7053cb127695a194005a52158c1bd179699f0ba84 WatchSource:0}: Error finding container 4de32190ce97a61078c64aa7053cb127695a194005a52158c1bd179699f0ba84: Status 404 returned error can't find the container with id 4de32190ce97a61078c64aa7053cb127695a194005a52158c1bd179699f0ba84 Apr 16 20:11:45.859494 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.859477 2563 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02dbfbd4_16bb_4990_8e97_87e6ff7a47f1.slice/crio-ba361c5b1ea4143b96383b0227bc094c48f95c077fe5ba93224cd310cadb8877 WatchSource:0}: Error finding container ba361c5b1ea4143b96383b0227bc094c48f95c077fe5ba93224cd310cadb8877: Status 404 returned error can't find the container with id ba361c5b1ea4143b96383b0227bc094c48f95c077fe5ba93224cd310cadb8877 Apr 16 20:11:45.860605 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.860579 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d5dcfae_693b_4c79_8475_17131d139947.slice/crio-2bb598e64802d050339a0cedda42a80f01ecb218cbd49c815e3e31c497a12236 WatchSource:0}: Error finding container 2bb598e64802d050339a0cedda42a80f01ecb218cbd49c815e3e31c497a12236: Status 404 returned error can't find the container with id 2bb598e64802d050339a0cedda42a80f01ecb218cbd49c815e3e31c497a12236 Apr 16 20:11:45.862761 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:11:45.862601 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc31a15f0_4c2b_4236_86e5_e92fbcf467e5.slice/crio-6cae45151f15706e35931f7c35d574542ee8697c7bfa19b667a89a1fd2b487a3 WatchSource:0}: Error finding container 6cae45151f15706e35931f7c35d574542ee8697c7bfa19b667a89a1fd2b487a3: Status 404 returned error can't find the container with id 6cae45151f15706e35931f7c35d574542ee8697c7bfa19b667a89a1fd2b487a3 Apr 16 20:11:45.866672 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.866643 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:45.866889 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.866784 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.866889 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.866833 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:11:46.866815532 +0000 UTC m=+4.155741300 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:45.967859 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:45.967725 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:45.967958 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.967860 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:45.967958 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.967875 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:45.967958 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.967884 2563 projected.go:194] Error preparing data for projected volume kube-api-access-5h7nf for pod openshift-network-diagnostics/network-check-target-ssfhx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:45.967958 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:45.967925 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf podName:34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:46.967911818 +0000 UTC m=+4.256837586 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h7nf" (UniqueName: "kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf") pod "network-check-target-ssfhx" (UID: "34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:46.200388 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.200290 2563 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 20:06:44 +0000 UTC" deadline="2027-11-22 15:12:25.563337303 +0000 UTC" Apr 16 20:11:46.200388 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.200317 2563 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14035h0m39.36302241s" Apr 16 20:11:46.294877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.294274 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:46.294877 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.294403 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:11:46.294877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.294271 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:46.294877 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.294833 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:11:46.316381 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.316324 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tgsc9" event={"ID":"2d5dcfae-693b-4c79-8475-17131d139947","Type":"ContainerStarted","Data":"2bb598e64802d050339a0cedda42a80f01ecb218cbd49c815e3e31c497a12236"} Apr 16 20:11:46.326097 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.326054 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bh4x8" event={"ID":"48db7bb0-6c87-484f-b5df-58ae1720d8f9","Type":"ContainerStarted","Data":"7d87f0b33b100447de985d67bc0c934a91824ddf63c03e9efb476b2b213cdb14"} Apr 16 20:11:46.335851 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.335828 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6mc4g" event={"ID":"0bd61e48-9d65-473a-b8b7-da6980e29685","Type":"ContainerStarted","Data":"7d00d776aa98824ab20f83327d2141efc3c3fe6de3113e5829fcbf5febeef980"} Apr 16 20:11:46.343138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.343118 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tnt6p" event={"ID":"e7efb583-4245-4d53-b571-eaf057bac81b","Type":"ContainerStarted","Data":"cea20e0aa963097f724107bfce5ea83c2a32498b2b15f997a927526ab4c13372"} Apr 16 20:11:46.356288 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.356262 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" event={"ID":"c31a15f0-4c2b-4236-86e5-e92fbcf467e5","Type":"ContainerStarted","Data":"6cae45151f15706e35931f7c35d574542ee8697c7bfa19b667a89a1fd2b487a3"} Apr 16 20:11:46.359236 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.359214 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerStarted","Data":"ba361c5b1ea4143b96383b0227bc094c48f95c077fe5ba93224cd310cadb8877"} Apr 16 20:11:46.365728 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.365704 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-btw62" event={"ID":"172b2c56-3bf0-4eef-aab2-4934181bce38","Type":"ContainerStarted","Data":"4de32190ce97a61078c64aa7053cb127695a194005a52158c1bd179699f0ba84"} Apr 16 20:11:46.378606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.377804 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"25008c2700ce70bda22d5d466c0055d6bd1cff9505ff5fb6975d0e7248417424"} Apr 16 20:11:46.391408 ip-10-0-138-118 
kubenswrapper[2563]: I0416 20:11:46.391387 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" event={"ID":"cd72324a-d342-4d7b-8fde-e0e8a56bbe39","Type":"ContainerStarted","Data":"99fe415da7ef152ecfdc220a581e171321d13fe084f6f56c20a45ef9220cb36c"} Apr 16 20:11:46.401598 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.399104 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" event={"ID":"00a800bd34a7cee6861a5791d3f97be3","Type":"ContainerStarted","Data":"8b83bc3b88e5f2f7b2813a79877f794fd25f1ab9775f8c8744c946dc0176385d"} Apr 16 20:11:46.877683 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.877652 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:46.877800 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.877788 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:46.877865 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.877847 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:11:48.877829788 +0000 UTC m=+6.166755555 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:46.978274 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:46.978196 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:46.978420 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.978350 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:46.978420 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.978368 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:46.978420 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.978381 2563 projected.go:194] Error preparing data for projected volume kube-api-access-5h7nf for pod openshift-network-diagnostics/network-check-target-ssfhx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:46.978599 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:46.978437 2563 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf podName:34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:48.978419272 +0000 UTC m=+6.267345077 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h7nf" (UniqueName: "kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf") pod "network-check-target-ssfhx" (UID: "34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:47.413951 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:47.413863 2563 generic.go:358] "Generic (PLEG): container finished" podID="c169960e93078427d6a6be239cc022de" containerID="7fcabbb2ecb7c0691f6c2ac80451f0acd30228e86e624b2577a80dd1da0feedc" exitCode=0 Apr 16 20:11:47.418649 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:47.418608 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" event={"ID":"c169960e93078427d6a6be239cc022de","Type":"ContainerDied","Data":"7fcabbb2ecb7c0691f6c2ac80451f0acd30228e86e624b2577a80dd1da0feedc"} Apr 16 20:11:47.433992 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:47.433836 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-118.ec2.internal" podStartSLOduration=3.433819943 podStartE2EDuration="3.433819943s" podCreationTimestamp="2026-04-16 20:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:46.413751235 +0000 UTC m=+3.702677021" watchObservedRunningTime="2026-04-16 20:11:47.433819943 +0000 UTC m=+4.722745729" Apr 16 20:11:48.294756 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:48.294619 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:48.294756 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:48.294671 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:48.294955 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.294749 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:11:48.295177 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.295153 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:11:48.419899 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:48.419865 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" event={"ID":"c169960e93078427d6a6be239cc022de","Type":"ContainerStarted","Data":"255e22dc57b6219681aa25145e553cacedf5610898395a8c0f973b1531408034"} Apr 16 20:11:48.892912 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:48.892878 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:48.893062 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.893034 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:48.893117 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.893098 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:11:52.893079387 +0000 UTC m=+10.182005149 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:48.993635 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:48.993596 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:48.993791 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.993775 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:48.993850 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.993797 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:48.993850 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.993810 2563 projected.go:194] Error preparing data for projected volume kube-api-access-5h7nf for pod openshift-network-diagnostics/network-check-target-ssfhx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:48.993956 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:48.993868 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf podName:34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a nodeName:}" failed. No retries permitted until 2026-04-16 20:11:52.99385106 +0000 UTC m=+10.282776843 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5h7nf" (UniqueName: "kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf") pod "network-check-target-ssfhx" (UID: "34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:50.294322 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:50.294291 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:50.294770 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:50.294429 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:11:50.294818 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:50.294291 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:50.295106 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:50.295061 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:11:52.295036 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:52.295003 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:52.295497 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:52.295009 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:52.295497 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:52.295118 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:11:52.295497 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:52.295219 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:11:52.920911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:52.920875 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:52.921099 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:52.921039 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:52.921159 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:52.921105 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.921086322 +0000 UTC m=+18.210012094 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:11:53.022620 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:53.022211 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:53.022620 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:53.022360 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:11:53.022620 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:53.022380 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:11:53.022620 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:53.022391 2563 projected.go:194] Error preparing data for projected volume kube-api-access-5h7nf for pod openshift-network-diagnostics/network-check-target-ssfhx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:53.022620 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:53.022443 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf podName:34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a nodeName:}" failed. No retries permitted until 2026-04-16 20:12:01.022425332 +0000 UTC m=+18.311351111 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5h7nf" (UniqueName: "kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf") pod "network-check-target-ssfhx" (UID: "34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:11:54.295207 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:54.295171 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:54.295546 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:54.295290 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:11:54.295728 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:54.295706 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:54.295835 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:54.295814 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:11:54.432412 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:54.432132 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" event={"ID":"cd72324a-d342-4d7b-8fde-e0e8a56bbe39","Type":"ContainerStarted","Data":"97659696e0d69f70644c7d0f4207010343c9d08d4388ed65626cf25e330c862a"} Apr 16 20:11:54.448423 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:54.448030 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-118.ec2.internal" podStartSLOduration=10.448013577 podStartE2EDuration="10.448013577s" podCreationTimestamp="2026-04-16 20:11:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:11:48.435493514 +0000 UTC m=+5.724419300" watchObservedRunningTime="2026-04-16 20:11:54.448013577 +0000 UTC m=+11.736939364" Apr 16 20:11:55.436287 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.435974 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" event={"ID":"c31a15f0-4c2b-4236-86e5-e92fbcf467e5","Type":"ContainerStarted","Data":"d8ef271d9c5b763aeef6b930630c0690efa59c8978d33a445d284bff7167fdc3"} Apr 16 20:11:55.438627 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.438596 2563 generic.go:358] "Generic (PLEG): container finished" podID="02dbfbd4-16bb-4990-8e97-87e6ff7a47f1" containerID="54ce55f37baeb73f06a32dfc2b4ae5c0f78aad8d973539913862de966535cd74" exitCode=0 Apr 16 20:11:55.438777 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.438673 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerDied","Data":"54ce55f37baeb73f06a32dfc2b4ae5c0f78aad8d973539913862de966535cd74"} Apr 16 20:11:55.440786 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.440584 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-btw62" event={"ID":"172b2c56-3bf0-4eef-aab2-4934181bce38","Type":"ContainerStarted","Data":"1242f71b8eca528be0074d7732d2d2d9d9fca1d52f86adade982e1eecc791fcb"} Apr 16 20:11:55.442153 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.442125 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tgsc9" event={"ID":"2d5dcfae-693b-4c79-8475-17131d139947","Type":"ContainerStarted","Data":"59c61855f196e6a2ff7215c980e35e2169dcc199c9e2a15a102e43c995f7953e"} Apr 16 20:11:55.443424 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.443402 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-bh4x8" event={"ID":"48db7bb0-6c87-484f-b5df-58ae1720d8f9","Type":"ContainerStarted","Data":"9b0caa1ec2962a0a1e37f827b863eda4e521bd6ab430b0f1d5f443e8c8aaef73"} Apr 16 20:11:55.457704 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.457633 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-x9vx5" podStartSLOduration=4.11374703 podStartE2EDuration="12.4576181s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.858042161 +0000 UTC m=+3.146967924" lastFinishedPulling="2026-04-16 20:11:54.201913228 +0000 UTC m=+11.490838994" observedRunningTime="2026-04-16 20:11:54.448153113 +0000 UTC m=+11.737078898" watchObservedRunningTime="2026-04-16 20:11:55.4576181 +0000 UTC m=+12.746543888" Apr 16 20:11:55.486081 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.486036 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tgsc9" podStartSLOduration=4.186815733 podStartE2EDuration="12.486022273s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.863978393 +0000 UTC m=+3.152904172" lastFinishedPulling="2026-04-16 20:11:54.163184945 +0000 UTC m=+11.452110712" observedRunningTime="2026-04-16 20:11:55.46968587 +0000 UTC m=+12.758611655" watchObservedRunningTime="2026-04-16 20:11:55.486022273 +0000 UTC m=+12.774948099" Apr 16 20:11:55.486197 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.486116 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-btw62" podStartSLOduration=4.170392873 podStartE2EDuration="12.48611049s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.86176683 +0000 UTC m=+3.150692607" lastFinishedPulling="2026-04-16 20:11:54.177484452 +0000 UTC m=+11.466410224" observedRunningTime="2026-04-16 20:11:55.484573511 +0000 UTC m=+12.773499294" watchObservedRunningTime="2026-04-16 20:11:55.48611049 +0000 UTC m=+12.775036275" Apr 16 20:11:55.497897 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.497854 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-bh4x8" podStartSLOduration=4.175570186 podStartE2EDuration="12.497842796s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.86060887 +0000 UTC m=+3.149534635" lastFinishedPulling="2026-04-16 
20:11:54.182881477 +0000 UTC m=+11.471807245" observedRunningTime="2026-04-16 20:11:55.497609338 +0000 UTC m=+12.786535133" watchObservedRunningTime="2026-04-16 20:11:55.497842796 +0000 UTC m=+12.786768579" Apr 16 20:11:55.924726 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.924696 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:55.925389 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:55.925365 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:56.294839 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.294769 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:11:56.294982 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.294770 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:11:56.294982 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:56.294879 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:11:56.294982 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:56.294960 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:11:56.419778 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.419752 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-87vlr"] Apr 16 20:11:56.432683 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.432658 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.432797 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:56.432735 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301" Apr 16 20:11:56.446655 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.446626 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-6mc4g" event={"ID":"0bd61e48-9d65-473a-b8b7-da6980e29685","Type":"ContainerStarted","Data":"90c5d39655eed0f86f91c4c42857fa7ecf8f3846113f25e2c58f1c0ace6157fe"} Apr 16 20:11:56.447098 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.447088 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:56.447602 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.447582 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tgsc9" Apr 16 20:11:56.546211 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.546140 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.546211 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.546206 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5c86052-4f3b-4b92-9618-f53193a55301-dbus\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.546374 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.546285 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5c86052-4f3b-4b92-9618-f53193a55301-kubelet-config\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.646924 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.646893 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.647099 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.646951 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5c86052-4f3b-4b92-9618-f53193a55301-dbus\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.647099 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.646984 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5c86052-4f3b-4b92-9618-f53193a55301-kubelet-config\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.647099 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:56.647048 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object 
"kube-system"/"original-pull-secret" not registered Apr 16 20:11:56.647099 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.647064 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/f5c86052-4f3b-4b92-9618-f53193a55301-kubelet-config\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:56.647280 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:56.647121 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret podName:f5c86052-4f3b-4b92-9618-f53193a55301 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:57.147101018 +0000 UTC m=+14.436026795 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret") pod "global-pull-secret-syncer-87vlr" (UID: "f5c86052-4f3b-4b92-9618-f53193a55301") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:56.647280 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:56.647230 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/f5c86052-4f3b-4b92-9618-f53193a55301-dbus\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:57.152663 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:57.152628 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:11:57.152820 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:57.152766 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:11:57.152879 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:57.152850 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret podName:f5c86052-4f3b-4b92-9618-f53193a55301 nodeName:}" failed. No retries permitted until 2026-04-16 20:11:58.152835068 +0000 UTC m=+15.441760843 (durationBeforeRetry 1s). 
Apr 16 20:11:58.159812 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:58.159776 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:11:58.160440 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:58.159935 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:58.160440 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:58.160001 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret podName:f5c86052-4f3b-4b92-9618-f53193a55301 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:00.159982374 +0000 UTC m=+17.448908141 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret") pod "global-pull-secret-syncer-87vlr" (UID: "f5c86052-4f3b-4b92-9618-f53193a55301") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:11:58.294840 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:58.294811 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh"
Apr 16 20:11:58.294994 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:58.294924 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx"
Apr 16 20:11:58.294994 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:58.294929 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c"
Apr 16 20:11:58.295107 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:58.295043 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a"
Apr 16 20:11:58.295107 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:11:58.295072 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:11:58.295205 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:11:58.295139 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301"
Apr 16 20:12:00.177791 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:00.177753 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:12:00.178400 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:00.177905 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:00.178400 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:00.177977 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret podName:f5c86052-4f3b-4b92-9618-f53193a55301 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:04.177957351 +0000 UTC m=+21.466883114 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret") pod "global-pull-secret-syncer-87vlr" (UID: "f5c86052-4f3b-4b92-9618-f53193a55301") : object "kube-system"/"original-pull-secret" not registered
pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:12:00.294930 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:00.294859 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:12:00.984311 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:00.984277 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:00.984488 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:00.984455 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:00.984541 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:00.984534 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:16.984513744 +0000 UTC m=+34.273439520 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 20:12:01.085011 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:01.084978 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:01.085205 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:01.085164 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 20:12:01.085205 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:01.085192 2563 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 20:12:01.085294 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:01.085210 2563 projected.go:194] Error preparing data for projected volume kube-api-access-5h7nf for pod openshift-network-diagnostics/network-check-target-ssfhx: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:01.085294 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:01.085277 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf podName:34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a nodeName:}" failed. 
No retries permitted until 2026-04-16 20:12:17.085257105 +0000 UTC m=+34.374182889 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-5h7nf" (UniqueName: "kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf") pod "network-check-target-ssfhx" (UID: "34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 20:12:02.294474 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:02.294435 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:02.294474 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:02.294471 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:02.295015 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:02.294586 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301" Apr 16 20:12:02.295015 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:02.294591 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:02.295015 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:02.294682 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:12:02.295015 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:02.294776 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
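[editor's note] Two distinct failure strings appear in these mount errors and they mean different things. object "ns"/"name" not registered is kubelet's watch-based secret/configMap manager saying it has not yet started or populated a cache for that object on behalf of the just-admitted pod; it clears once the corresponding "Caches populated" reflector lines appear further below. By contrast, secret "..." not found (seen later for image-registry-tls and dns-default-metrics-tls) is a live answer: the object genuinely does not exist yet. A toy model of the first case, with names taken from the log:

```go
// objcache.go - toy model of the semantics behind "object X/Y not
// registered": lookups only succeed after a cache has been registered
// for the object, independently of whether the object exists.
package main

import "fmt"

type objectCache struct {
	registered map[string]bool   // key -> a reflector/cache has been started
	data       map[string]string // key -> cached object payload
}

func (c *objectCache) get(ns, name string) (string, error) {
	key := ns + "/" + name
	if !c.registered[key] {
		return "", fmt.Errorf("object %q/%q not registered", ns, name)
	}
	v, ok := c.data[key]
	if !ok {
		return "", fmt.Errorf("secret %q not found", name)
	}
	return v, nil
}

func main() {
	c := &objectCache{registered: map[string]bool{}, data: map[string]string{}}
	_, err := c.get("kube-system", "original-pull-secret")
	fmt.Println(err) // object "kube-system"/"original-pull-secret" not registered
}
```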
pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:12:03.074090 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.073808 2563 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 20:12:03.219079 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.218807 2563 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T20:12:03.074086579Z","UUID":"1a4f837e-08fd-49e3-9826-997eb91e648e","Handler":null,"Name":"","Endpoint":""} Apr 16 20:12:03.220461 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.220443 2563 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 20:12:03.220551 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.220483 2563 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 20:12:03.459687 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.459650 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-tnt6p" event={"ID":"e7efb583-4245-4d53-b571-eaf057bac81b","Type":"ContainerStarted","Data":"8b5b65c1e8e34262e9b0948c2388e2b99f7158ef305b7badb64bd1cba3b1f763"} Apr 16 20:12:03.461653 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.461624 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" event={"ID":"c31a15f0-4c2b-4236-86e5-e92fbcf467e5","Type":"ContainerStarted","Data":"0993477c8ebfe8a11fe10e698ed8821e56e1eec5579d3f41fb12253977dd4715"} Apr 16 20:12:03.464383 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.464355 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"a61f6ef3bd2e05d9418ffc0c6b13801842d110a27a0873e924df3b12453059a7"} Apr 16 20:12:03.464383 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.464387 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"1bfddba937d3d4cac1b62c742b05d45ae65c7afa587f36f7e0bf5b4c47bc6ef5"} Apr 16 20:12:03.464534 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.464398 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"276ae83cb509eced5342b2f316b4787d60cba7badef1ac33935e593a78a33c3c"} Apr 16 20:12:03.464534 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.464408 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"2860a67c447fd280bc8ca37c9f352753d637ec323c4a6668f338a528e9b5d105"} Apr 16 20:12:03.464534 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.464416 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" 
event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"117b54b1ad5614e42ba500332a53dace27b531be998924b8530eea1f7844dc1c"} Apr 16 20:12:03.464534 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.464427 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"9f96878fdff6fc156cc134847730ac27d2a8caade96e23b1e3a4c2b37bfe4fa5"} Apr 16 20:12:03.478426 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.478224 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-6mc4g" podStartSLOduration=12.156721958 podStartE2EDuration="20.478211252s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.855995738 +0000 UTC m=+3.144921503" lastFinishedPulling="2026-04-16 20:11:54.177485024 +0000 UTC m=+11.466410797" observedRunningTime="2026-04-16 20:11:56.488721492 +0000 UTC m=+13.777647282" watchObservedRunningTime="2026-04-16 20:12:03.478211252 +0000 UTC m=+20.767137041" Apr 16 20:12:03.478713 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:03.478680 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-tnt6p" podStartSLOduration=3.450351751 podStartE2EDuration="20.478672201s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.853036761 +0000 UTC m=+3.141962533" lastFinishedPulling="2026-04-16 20:12:02.88135722 +0000 UTC m=+20.170282983" observedRunningTime="2026-04-16 20:12:03.477883589 +0000 UTC m=+20.766809370" watchObservedRunningTime="2026-04-16 20:12:03.478672201 +0000 UTC m=+20.767597985" Apr 16 20:12:04.208903 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:04.208877 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:04.209046 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:04.208974 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:04.209046 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:04.209029 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret podName:f5c86052-4f3b-4b92-9618-f53193a55301 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:12.209014201 +0000 UTC m=+29.497939973 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret") pod "global-pull-secret-syncer-87vlr" (UID: "f5c86052-4f3b-4b92-9618-f53193a55301") : object "kube-system"/"original-pull-secret" not registered Apr 16 20:12:04.294272 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:04.294246 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:04.294408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:04.294246 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:04.294408 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:04.294343 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:12:04.294502 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:04.294419 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301" Apr 16 20:12:04.294502 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:04.294245 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:04.294639 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:04.294513 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:12:04.468013 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:04.467927 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" event={"ID":"c31a15f0-4c2b-4236-86e5-e92fbcf467e5","Type":"ContainerStarted","Data":"e2391b025364a353e6b83a84ce1a032bbc6eee46381472c5625287fd58513fb3"} Apr 16 20:12:06.294824 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:06.294804 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:06.295109 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:06.294812 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:06.295109 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:06.294896 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:12:06.295109 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:06.294812 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:06.295109 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:06.294974 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:12:06.295109 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:06.295023 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301" Apr 16 20:12:06.473083 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:06.473059 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"b7acf00047e9832c73864de2d2c04905e42e07d7f5cfb4adffbd8e1e852e434c"} Apr 16 20:12:07.476229 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:07.476200 2563 generic.go:358] "Generic (PLEG): container finished" podID="02dbfbd4-16bb-4990-8e97-87e6ff7a47f1" containerID="d5b276fb209b7c38964faaf5e71b554067057fa9eb13aa28e8581b9199ce3de5" exitCode=0 Apr 16 20:12:07.476667 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:07.476240 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerDied","Data":"d5b276fb209b7c38964faaf5e71b554067057fa9eb13aa28e8581b9199ce3de5"} Apr 16 20:12:07.509002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:07.508964 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-54qm7" podStartSLOduration=6.475247157 podStartE2EDuration="24.508951602s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.865397258 +0000 UTC m=+3.154323031" lastFinishedPulling="2026-04-16 20:12:03.899101714 +0000 UTC m=+21.188027476" observedRunningTime="2026-04-16 20:12:04.483869729 +0000 UTC m=+21.772795513" watchObservedRunningTime="2026-04-16 20:12:07.508951602 +0000 UTC m=+24.797877397" Apr 16 20:12:08.294301 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.294276 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:08.294428 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.294283 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:08.294428 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:08.294363 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301" Apr 16 20:12:08.294510 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:08.294434 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:12:08.294510 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.294464 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:08.294602 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:08.294523 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a" Apr 16 20:12:08.479917 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.479468 2563 generic.go:358] "Generic (PLEG): container finished" podID="02dbfbd4-16bb-4990-8e97-87e6ff7a47f1" containerID="ef7fed7f9f7b0f2747a1d82960279e8ea8c21cdc574f329aad25603b907384e7" exitCode=0 Apr 16 20:12:08.479917 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.479538 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerDied","Data":"ef7fed7f9f7b0f2747a1d82960279e8ea8c21cdc574f329aad25603b907384e7"} Apr 16 20:12:08.484378 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.482988 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" event={"ID":"3d468fb0-6c11-4fba-b1e4-ef75ae52d254","Type":"ContainerStarted","Data":"0df03fbb533b7bbb4eda791062eba821a043050a0d946018b6de916b7c31679e"} Apr 16 20:12:08.484378 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.483304 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:12:08.484378 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.483321 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:12:08.497438 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.496921 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:12:08.569057 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:08.568834 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" podStartSLOduration=8.546514384 podStartE2EDuration="25.568820825s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.859051899 +0000 UTC m=+3.147977665" lastFinishedPulling="2026-04-16 20:12:02.881358335 +0000 UTC m=+20.170284106" observedRunningTime="2026-04-16 20:12:08.567346489 +0000 UTC m=+25.856272271" watchObservedRunningTime="2026-04-16 20:12:08.568820825 +0000 UTC m=+25.857746634" Apr 16 20:12:09.486517 ip-10-0-138-118 
Apr 16 20:12:09.486517 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:09.486494 2563 generic.go:358] "Generic (PLEG): container finished" podID="02dbfbd4-16bb-4990-8e97-87e6ff7a47f1" containerID="5cde45814160bee594a94926a9ee8c88ef58f536fa699a0ba62cd6887e6c1313" exitCode=0
Apr 16 20:12:09.486849 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:09.486589 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerDied","Data":"5cde45814160bee594a94926a9ee8c88ef58f536fa699a0ba62cd6887e6c1313"}
Apr 16 20:12:09.486982 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:09.486966 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s"
Apr 16 20:12:09.500013 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:09.499996 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s"
Apr 16 20:12:10.295067 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.295039 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:12:10.295067 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.295066 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx"
Apr 16 20:12:10.295303 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.295047 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh"
Apr 16 20:12:10.295303 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:10.295154 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301"
Apr 16 20:12:10.295303 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:10.295238 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c"
Apr 16 20:12:10.295428 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:10.295315 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a"
Apr 16 20:12:10.863405 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.863377 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-87vlr"]
Apr 16 20:12:10.863842 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.863483 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:12:10.863842 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:10.863612 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301"
Apr 16 20:12:10.865401 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.865377 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mx2qh"]
Apr 16 20:12:10.865510 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.865469 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh"
Apr 16 20:12:10.865612 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:10.865587 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c"
Apr 16 20:12:10.877041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.876835 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ssfhx"]
Apr 16 20:12:10.877041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:10.876913 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx"
Apr 16 20:12:10.877041 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:10.877001 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a"
Apr 16 20:12:12.271131 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:12.271103 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:12:12.271577 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:12.271216 2563 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:12.271577 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:12.271273 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret podName:f5c86052-4f3b-4b92-9618-f53193a55301 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:28.271254694 +0000 UTC m=+45.560180456 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret") pod "global-pull-secret-syncer-87vlr" (UID: "f5c86052-4f3b-4b92-9618-f53193a55301") : object "kube-system"/"original-pull-secret" not registered
Apr 16 20:12:12.294924 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:12.294893 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:12:12.295037 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:12.295016 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301"
Apr 16 20:12:12.295101 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:12.295075 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx"
Apr 16 20:12:12.295202 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:12.295180 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a"
Apr 16 20:12:13.295586 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:13.295545 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh"
Apr 16 20:12:13.296088 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:13.296060 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c"
Apr 16 20:12:14.294628 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.294602 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr"
Apr 16 20:12:14.294742 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.294724 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-87vlr" podUID="f5c86052-4f3b-4b92-9618-f53193a55301"
Apr 16 20:12:14.294780 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.294742 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx"
Apr 16 20:12:14.294892 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.294849 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ssfhx" podUID="34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a"
Apr 16 20:12:14.522693 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.522666 2563 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-118.ec2.internal" event="NodeReady"
Apr 16 20:12:14.523089 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.522793 2563 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Need to start a new one" pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:14.618236 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.618218 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dstbq\"" Apr 16 20:12:14.618337 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.618291 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 20:12:14.619219 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.619200 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 20:12:14.636270 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.636250 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wfhdc"] Apr 16 20:12:14.636270 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.636271 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qsrv5"] Apr 16 20:12:14.636450 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.636360 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:14.642479 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.642367 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jqqmg\"" Apr 16 20:12:14.642479 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.642444 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 20:12:14.642838 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.642816 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 20:12:14.643053 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.643039 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 20:12:14.690143 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690112 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-certificates\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.690256 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690151 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-installation-pull-secrets\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.690256 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690179 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-bound-sa-token\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.690256 ip-10-0-138-118 kubenswrapper[2563]: 
Apr 16 20:12:14.690256 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690212 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed1e1b27-b156-463d-9ee6-eaa33682d57c-tmp-dir\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.690377 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690316 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-image-registry-private-configuration\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.690377 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690348 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.690377 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690373 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-trusted-ca\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.690509 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690400 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a36a7e81-30a8-46f8-b55e-9a5b61290032-ca-trust-extracted\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.690509 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690437 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed1e1b27-b156-463d-9ee6-eaa33682d57c-config-volume\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.690509 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690476 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.690509 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690503 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ck5h\" (UniqueName: \"kubernetes.io/projected/ed1e1b27-b156-463d-9ee6-eaa33682d57c-kube-api-access-4ck5h\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.690681 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.690532 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gpgl\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-kube-api-access-2gpgl\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791398 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791372 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-image-registry-private-configuration\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791403 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791419 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-trusted-ca\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791440 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a36a7e81-30a8-46f8-b55e-9a5b61290032-ca-trust-extracted\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791465 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed1e1b27-b156-463d-9ee6-eaa33682d57c-config-volume\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.791525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791488 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.791525 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791511 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ck5h\" (UniqueName: \"kubernetes.io/projected/ed1e1b27-b156-463d-9ee6-eaa33682d57c-kube-api-access-4ck5h\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791534 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2gpgl\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-kube-api-access-2gpgl\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.791549 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.791577 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791579 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fzq9\" (UniqueName: \"kubernetes.io/projected/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-kube-api-access-4fzq9\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5"
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.791655 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.791687 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:15.291667823 +0000 UTC m=+32.580593598 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.791726 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:15.291707874 +0000 UTC m=+32.580633656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791656 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-certificates\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791767 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-installation-pull-secrets\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791800 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5"
Apr 16 20:12:14.791828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791833 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-bound-sa-token\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.792259 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.791882 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed1e1b27-b156-463d-9ee6-eaa33682d57c-tmp-dir\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.792743 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.792710 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed1e1b27-b156-463d-9ee6-eaa33682d57c-config-volume\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc"
Apr 16 20:12:14.792942 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.792920 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-certificates\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq"
Apr 16 20:12:14.793019 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.792957 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a36a7e81-30a8-46f8-b55e-9a5b61290032-ca-trust-extracted\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") "
pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.793079 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.793047 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed1e1b27-b156-463d-9ee6-eaa33682d57c-tmp-dir\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:14.793275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.793255 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-trusted-ca\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.796318 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.796302 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-image-registry-private-configuration\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.796318 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.796309 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-installation-pull-secrets\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.802435 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.802414 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-bound-sa-token\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.803069 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.802714 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gpgl\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-kube-api-access-2gpgl\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:14.803069 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.803041 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ck5h\" (UniqueName: \"kubernetes.io/projected/ed1e1b27-b156-463d-9ee6-eaa33682d57c-kube-api-access-4ck5h\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:14.892465 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.892389 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:14.892636 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.892498 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-4fzq9\" (UniqueName: \"kubernetes.io/projected/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-kube-api-access-4fzq9\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:14.892636 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.892545 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:14.892636 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:14.892630 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:15.392609108 +0000 UTC m=+32.681534882 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:12:14.905946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:14.905921 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fzq9\" (UniqueName: \"kubernetes.io/projected/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-kube-api-access-4fzq9\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:15.294355 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:15.294326 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:15.294501 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:15.294359 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:15.294501 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:15.294397 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:15.294501 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:15.294471 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:15.294501 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:15.294488 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:12:15.294689 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:15.294537 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:16.294520786 +0000 UTC m=+33.583446561 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:12:15.294689 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:15.294578 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:15.294689 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:15.294641 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:16.294623334 +0000 UTC m=+33.583549097 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found Apr 16 20:12:15.297198 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:15.297176 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 20:12:15.297354 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:15.297339 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q2prt\"" Apr 16 20:12:15.394978 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:15.394958 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:15.395150 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:15.395135 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:15.395212 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:15.395193 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:16.395175405 +0000 UTC m=+33.684101182 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:12:16.295007 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.294981 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:16.295648 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.295160 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:16.298062 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.298048 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 20:12:16.299092 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.299076 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-8rq6t\"" Apr 16 20:12:16.299176 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.299078 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 20:12:16.299176 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.299078 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 20:12:16.301259 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.301238 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:16.301375 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.301278 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:16.301375 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:16.301371 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:16.301609 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:16.301377 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:16.301609 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:16.301394 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:12:16.301609 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:16.301427 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:18.301410302 +0000 UTC m=+35.590336079 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found Apr 16 20:12:16.301609 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:16.301453 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:18.301435735 +0000 UTC m=+35.590361515 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:12:16.402263 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.402241 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:16.402341 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:16.402318 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:16.402377 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:16.402354 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:18.402343642 +0000 UTC m=+35.691269405 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:12:16.503431 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.503412 2563 generic.go:358] "Generic (PLEG): container finished" podID="02dbfbd4-16bb-4990-8e97-87e6ff7a47f1" containerID="4042ac50cb92ea68223b70a869ea5426a7083bc680b07b74f621950a085da05d" exitCode=0 Apr 16 20:12:16.503522 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:16.503461 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerDied","Data":"4042ac50cb92ea68223b70a869ea5426a7083bc680b07b74f621950a085da05d"} Apr 16 20:12:17.004891 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.004859 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:17.005056 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:17.004994 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:12:17.005094 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:17.005057 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:49.005040649 +0000 UTC m=+66.293966411 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : secret "metrics-daemon-secret" not found Apr 16 20:12:17.105863 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.105838 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:17.108280 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.108262 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5h7nf\" (UniqueName: \"kubernetes.io/projected/34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a-kube-api-access-5h7nf\") pod \"network-check-target-ssfhx\" (UID: \"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a\") " pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:17.208630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.208613 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:17.327009 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.326793 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ssfhx"] Apr 16 20:12:17.331551 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:12:17.331528 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34c9ee5c_a94b_41cc_8dc1_9d7ff0ef981a.slice/crio-244b8c9a85a90a51abd57cc5a241b88ee2ec2952936d068bb8e7cbacfb40ca13 WatchSource:0}: Error finding container 244b8c9a85a90a51abd57cc5a241b88ee2ec2952936d068bb8e7cbacfb40ca13: Status 404 returned error can't find the container with id 244b8c9a85a90a51abd57cc5a241b88ee2ec2952936d068bb8e7cbacfb40ca13 Apr 16 20:12:17.507487 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.507458 2563 generic.go:358] "Generic (PLEG): container finished" podID="02dbfbd4-16bb-4990-8e97-87e6ff7a47f1" containerID="891715bb4bb6c4c9827f23208f030bf97436078dee6913484c4d2b6604604fbf" exitCode=0 Apr 16 20:12:17.507613 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.507534 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerDied","Data":"891715bb4bb6c4c9827f23208f030bf97436078dee6913484c4d2b6604604fbf"} Apr 16 20:12:17.508621 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:17.508597 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ssfhx" event={"ID":"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a","Type":"ContainerStarted","Data":"244b8c9a85a90a51abd57cc5a241b88ee2ec2952936d068bb8e7cbacfb40ca13"} Apr 16 20:12:18.314668 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:18.314632 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:18.314871 ip-10-0-138-118 
kubenswrapper[2563]: I0416 20:12:18.314675 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:18.314871 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:18.314784 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:18.314871 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:18.314802 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:12:18.314871 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:18.314805 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:18.314871 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:18.314867 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:22.314848542 +0000 UTC m=+39.603774324 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:12:18.315084 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:18.314885 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:22.314877892 +0000 UTC m=+39.603803656 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found Apr 16 20:12:18.415197 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:18.415165 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:18.415610 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:18.415326 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:18.415610 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:18.415396 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:22.415376471 +0000 UTC m=+39.704302240 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:12:18.514015 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:18.513984 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-krhbv" event={"ID":"02dbfbd4-16bb-4990-8e97-87e6ff7a47f1","Type":"ContainerStarted","Data":"06ab4c43a3876b848dd10f92c4334bd565d311fc41e12d55e2a461b548e57451"} Apr 16 20:12:18.540942 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:18.540900 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-krhbv" podStartSLOduration=5.899451574 podStartE2EDuration="35.540885723s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:11:45.864362687 +0000 UTC m=+3.153288465" lastFinishedPulling="2026-04-16 20:12:15.505796848 +0000 UTC m=+32.794722614" observedRunningTime="2026-04-16 20:12:18.539281686 +0000 UTC m=+35.828207505" watchObservedRunningTime="2026-04-16 20:12:18.540885723 +0000 UTC m=+35.829811506" Apr 16 20:12:21.522490 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:21.522259 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ssfhx" event={"ID":"34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a","Type":"ContainerStarted","Data":"00bd395871a1e644e39d2717e7020ceb68ae96c1b76edba92c5c95c15d06d3ab"} Apr 16 20:12:21.522814 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:21.522506 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:12:21.541151 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:21.541110 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ssfhx" podStartSLOduration=35.350524596 podStartE2EDuration="38.541098178s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:12:17.333466377 +0000 UTC m=+34.622392140" lastFinishedPulling="2026-04-16 20:12:20.524039955 +0000 UTC m=+37.812965722" observedRunningTime="2026-04-16 20:12:21.540094936 +0000 UTC m=+38.829020729" watchObservedRunningTime="2026-04-16 20:12:21.541098178 +0000 UTC m=+38.830023965" Apr 16 20:12:22.343538 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:22.343502 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:22.343538 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:22.343540 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:22.343750 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:22.343641 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:22.343750 ip-10-0-138-118 kubenswrapper[2563]: E0416 
20:12:22.343646 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:22.343750 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:22.343663 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:12:22.343750 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:22.343693 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:30.343677941 +0000 UTC m=+47.632603703 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found Apr 16 20:12:22.343750 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:22.343706 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:30.343700627 +0000 UTC m=+47.632626389 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:12:22.444479 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:22.444456 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:22.444601 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:22.444592 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:22.444647 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:22.444631 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:30.444620025 +0000 UTC m=+47.733545792 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:12:28.286068 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:28.286035 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:28.289910 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:28.289884 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/f5c86052-4f3b-4b92-9618-f53193a55301-original-pull-secret\") pod \"global-pull-secret-syncer-87vlr\" (UID: \"f5c86052-4f3b-4b92-9618-f53193a55301\") " pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:28.303988 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:28.303969 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-87vlr" Apr 16 20:12:28.414037 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:28.414011 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-87vlr"] Apr 16 20:12:28.416923 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:12:28.416894 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c86052_4f3b_4b92_9618_f53193a55301.slice/crio-e427c7329110dc544068b8a1e4ed9397c5ccdf6d95589611817d682add910406 WatchSource:0}: Error finding container e427c7329110dc544068b8a1e4ed9397c5ccdf6d95589611817d682add910406: Status 404 returned error can't find the container with id e427c7329110dc544068b8a1e4ed9397c5ccdf6d95589611817d682add910406 Apr 16 20:12:28.535184 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:28.535152 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-87vlr" event={"ID":"f5c86052-4f3b-4b92-9618-f53193a55301","Type":"ContainerStarted","Data":"e427c7329110dc544068b8a1e4ed9397c5ccdf6d95589611817d682add910406"} Apr 16 20:12:30.402260 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:30.402231 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:30.402701 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:30.402267 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:30.402701 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:30.402370 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:30.402701 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:30.402390 2563 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:30.402701 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:30.402413 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:12:30.402701 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:30.402423 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:46.402407512 +0000 UTC m=+63.691333298 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found Apr 16 20:12:30.402701 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:30.402477 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:12:46.402459368 +0000 UTC m=+63.691385144 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:12:30.503186 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:30.503154 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:30.503342 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:30.503301 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:30.503404 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:30.503393 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:12:46.503370256 +0000 UTC m=+63.792296032 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:12:33.544926 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:33.544895 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-87vlr" event={"ID":"f5c86052-4f3b-4b92-9618-f53193a55301","Type":"ContainerStarted","Data":"a5e913993e35032c4c560bc85974e9d8f22deaba90faf776873d6db6c23677a9"} Apr 16 20:12:33.561497 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:33.561459 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-87vlr" podStartSLOduration=33.274708801 podStartE2EDuration="37.561448006s" podCreationTimestamp="2026-04-16 20:11:56 +0000 UTC" firstStartedPulling="2026-04-16 20:12:28.41853976 +0000 UTC m=+45.707465522" lastFinishedPulling="2026-04-16 20:12:32.705278954 +0000 UTC m=+49.994204727" observedRunningTime="2026-04-16 20:12:33.560884843 +0000 UTC m=+50.849810631" watchObservedRunningTime="2026-04-16 20:12:33.561448006 +0000 UTC m=+50.850373789" Apr 16 20:12:41.503013 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:41.502987 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qjb9s" Apr 16 20:12:46.410633 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:46.410601 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:12:46.410984 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:46.410639 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:12:46.410984 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:46.410739 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:12:46.410984 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:46.410753 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:12:46.410984 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:46.410797 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:13:18.410784484 +0000 UTC m=+95.699710246 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found Apr 16 20:12:46.410984 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:46.410756 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:12:46.410984 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:46.410867 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:13:18.410853516 +0000 UTC m=+95.699779281 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:12:46.511082 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:46.511051 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:12:46.511197 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:46.511171 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:12:46.511233 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:46.511217 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:13:18.511204493 +0000 UTC m=+95.800130274 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:12:49.028891 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:49.028855 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:12:49.029247 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:49.028989 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 20:12:49.029247 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:12:49.029049 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:13:53.029033842 +0000 UTC m=+130.317959604 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : secret "metrics-daemon-secret" not found Apr 16 20:12:52.526549 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:12:52.526517 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ssfhx" Apr 16 20:13:18.420641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:13:18.420491 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:13:18.420641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:13:18.420588 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:13:18.421168 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:18.420680 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:13:18.421168 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:18.420692 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:13:18.421168 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:18.420755 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:22.420739708 +0000 UTC m=+159.709665477 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:13:18.421168 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:18.420751 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:13:18.421168 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:18.420812 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:14:22.420796736 +0000 UTC m=+159.709722514 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found
Apr 16 20:13:18.521363 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:13:18.521339 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5"
Apr 16 20:13:18.521513 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:18.521466 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 20:13:18.521570 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:18.521518 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:14:22.521503484 +0000 UTC m=+159.810429248 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found
Apr 16 20:13:53.061017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:13:53.060969 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh"
Apr 16 20:13:53.061491 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:53.061091 2563 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 20:13:53.061491 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:13:53.061157 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs podName:51782696-d22a-4882-9ad3-4de29c66583c nodeName:}" failed. No retries permitted until 2026-04-16 20:15:55.061143549 +0000 UTC m=+252.350069311 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs") pod "network-metrics-daemon-mx2qh" (UID: "51782696-d22a-4882-9ad3-4de29c66583c") : secret "metrics-daemon-secret" not found
Apr 16 20:14:12.528856 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.528825 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"]
Apr 16 20:14:12.531585 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.531569 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.593555 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.593527 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"]
Apr 16 20:14:12.612921 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.612901 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 20:14:12.613018 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.613002 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-dmqj2\""
Apr 16 20:14:12.614082 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.614068 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 20:14:12.614132 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.614070 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 20:14:12.614227 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.614209 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 20:14:12.682937 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.682914 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49451082-0796-4a50-af11-a585eef9af8c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.683052 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.682956 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49451082-0796-4a50-af11-a585eef9af8c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.683052 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.682975 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4npb\" (UniqueName: \"kubernetes.io/projected/49451082-0796-4a50-af11-a585eef9af8c-kube-api-access-b4npb\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.783270 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.783212 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49451082-0796-4a50-af11-a585eef9af8c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.783270 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.783238 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4npb\" (UniqueName: \"kubernetes.io/projected/49451082-0796-4a50-af11-a585eef9af8c-kube-api-access-b4npb\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.783400 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.783297 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49451082-0796-4a50-af11-a585eef9af8c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.783831 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.783813 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49451082-0796-4a50-af11-a585eef9af8c-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.785405 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.785388 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49451082-0796-4a50-af11-a585eef9af8c-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.791921 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.791899 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4npb\" (UniqueName: \"kubernetes.io/projected/49451082-0796-4a50-af11-a585eef9af8c-kube-api-access-b4npb\") pod \"kube-storage-version-migrator-operator-6769c5d45-v8mkb\" (UID: \"49451082-0796-4a50-af11-a585eef9af8c\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.839220 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.839191 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"
Apr 16 20:14:12.952043 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:12.952013 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb"]
Apr 16 20:14:12.955297 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:12.955268 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49451082_0796_4a50_af11_a585eef9af8c.slice/crio-2dcd48b35daeb821a75c0591e908d5eccdcfd9400ed37b2edbed030ee949c0f6 WatchSource:0}: Error finding container 2dcd48b35daeb821a75c0591e908d5eccdcfd9400ed37b2edbed030ee949c0f6: Status 404 returned error can't find the container with id 2dcd48b35daeb821a75c0591e908d5eccdcfd9400ed37b2edbed030ee949c0f6
Apr 16 20:14:13.727546 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:13.727503 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb" event={"ID":"49451082-0796-4a50-af11-a585eef9af8c","Type":"ContainerStarted","Data":"2dcd48b35daeb821a75c0591e908d5eccdcfd9400ed37b2edbed030ee949c0f6"}
Apr 16 20:14:14.731079 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:14.731046 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb" event={"ID":"49451082-0796-4a50-af11-a585eef9af8c","Type":"ContainerStarted","Data":"52d21edc02df6cbfb1c1eec8ac13d62d7c8352f776beb05e7fde1c30a49d8e50"}
Apr 16 20:14:14.749519 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:14.749477 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb" podStartSLOduration=1.111443798 podStartE2EDuration="2.749463895s" podCreationTimestamp="2026-04-16 20:14:12 +0000 UTC" firstStartedPulling="2026-04-16 20:14:12.956683224 +0000 UTC m=+150.245608987" lastFinishedPulling="2026-04-16 20:14:14.59470332 +0000 UTC m=+151.883629084" observedRunningTime="2026-04-16 20:14:14.748811624 +0000 UTC m=+152.037737410" watchObservedRunningTime="2026-04-16 20:14:14.749463895 +0000 UTC m=+152.038389731"
Apr 16 20:14:16.383796 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.383768 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d"]
Apr 16 20:14:16.386508 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.386493 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d"
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" Apr 16 20:14:16.389025 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.389002 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4q4ml\"" Apr 16 20:14:16.395761 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.395742 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d"] Apr 16 20:14:16.509223 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.509196 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcbmw\" (UniqueName: \"kubernetes.io/projected/973151c8-de39-4310-b2e4-204c7f502b48-kube-api-access-hcbmw\") pod \"network-check-source-8894fc9bd-t9s2d\" (UID: \"973151c8-de39-4310-b2e4-204c7f502b48\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" Apr 16 20:14:16.610144 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.610112 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcbmw\" (UniqueName: \"kubernetes.io/projected/973151c8-de39-4310-b2e4-204c7f502b48-kube-api-access-hcbmw\") pod \"network-check-source-8894fc9bd-t9s2d\" (UID: \"973151c8-de39-4310-b2e4-204c7f502b48\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" Apr 16 20:14:16.618576 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.618541 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcbmw\" (UniqueName: \"kubernetes.io/projected/973151c8-de39-4310-b2e4-204c7f502b48-kube-api-access-hcbmw\") pod \"network-check-source-8894fc9bd-t9s2d\" (UID: \"973151c8-de39-4310-b2e4-204c7f502b48\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" Apr 16 20:14:16.694486 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.694464 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" Apr 16 20:14:16.803673 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:16.803642 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d"] Apr 16 20:14:16.806368 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:16.806336 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973151c8_de39_4310_b2e4_204c7f502b48.slice/crio-adf0bccbab858f2add5dcbd18bb91e3659c42dfb7ec581a1e6898ae06c415274 WatchSource:0}: Error finding container adf0bccbab858f2add5dcbd18bb91e3659c42dfb7ec581a1e6898ae06c415274: Status 404 returned error can't find the container with id adf0bccbab858f2add5dcbd18bb91e3659c42dfb7ec581a1e6898ae06c415274 Apr 16 20:14:17.606295 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:17.606248 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" podUID="a36a7e81-30a8-46f8-b55e-9a5b61290032" Apr 16 20:14:17.622519 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:17.622494 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wfhdc" podUID="ed1e1b27-b156-463d-9ee6-eaa33682d57c" Apr 16 20:14:17.644956 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:17.644934 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-qsrv5" podUID="d78ddec9-9c5c-40a0-b5b1-d748cb8a110c" Apr 16 20:14:17.741609 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:17.741576 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" event={"ID":"973151c8-de39-4310-b2e4-204c7f502b48","Type":"ContainerStarted","Data":"7d600527b8347613315721ccb5b9b3700a0bd062bfc4983c3773e65f0469f6bb"} Apr 16 20:14:17.741802 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:17.741615 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" event={"ID":"973151c8-de39-4310-b2e4-204c7f502b48","Type":"ContainerStarted","Data":"adf0bccbab858f2add5dcbd18bb91e3659c42dfb7ec581a1e6898ae06c415274"} Apr 16 20:14:17.741802 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:17.741626 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wfhdc" Apr 16 20:14:17.759659 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:17.759621 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-t9s2d" podStartSLOduration=1.759611088 podStartE2EDuration="1.759611088s" podCreationTimestamp="2026-04-16 20:14:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:14:17.759037037 +0000 UTC m=+155.047962815" watchObservedRunningTime="2026-04-16 20:14:17.759611088 +0000 UTC m=+155.048536872" Apr 16 20:14:18.302986 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:18.302948 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-mx2qh" podUID="51782696-d22a-4882-9ad3-4de29c66583c" Apr 16 20:14:18.492990 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.492961 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-l5rfr"] Apr 16 20:14:18.496112 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.496086 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.499612 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.499585 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 20:14:18.499815 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.499795 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 20:14:18.500639 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.500622 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-cjh6r\"" Apr 16 20:14:18.500726 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.500665 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 20:14:18.500726 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.500719 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 20:14:18.505324 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.505306 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-l5rfr"] Apr 16 20:14:18.599751 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.599697 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk"] Apr 16 20:14:18.602436 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.602423 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:18.604960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.604941 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 20:14:18.604960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.604956 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 20:14:18.605113 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.605053 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-zrhjp\"" Apr 16 20:14:18.611789 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.611769 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk"] Apr 16 20:14:18.627081 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.627065 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1c4282e-fd5a-4971-b838-55e6f45dc544-signing-cabundle\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.627165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.627088 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tssr\" (UniqueName: \"kubernetes.io/projected/d1c4282e-fd5a-4971-b838-55e6f45dc544-kube-api-access-6tssr\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.627165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.627111 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1c4282e-fd5a-4971-b838-55e6f45dc544-signing-key\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.727704 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.727684 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:18.727791 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.727736 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:18.727831 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.727812 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1c4282e-fd5a-4971-b838-55e6f45dc544-signing-cabundle\") pod 
\"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.727862 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.727833 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tssr\" (UniqueName: \"kubernetes.io/projected/d1c4282e-fd5a-4971-b838-55e6f45dc544-kube-api-access-6tssr\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.727862 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.727855 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1c4282e-fd5a-4971-b838-55e6f45dc544-signing-key\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.728290 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.728274 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/d1c4282e-fd5a-4971-b838-55e6f45dc544-signing-cabundle\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.730110 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.730090 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/d1c4282e-fd5a-4971-b838-55e6f45dc544-signing-key\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.736451 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.736428 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tssr\" (UniqueName: \"kubernetes.io/projected/d1c4282e-fd5a-4971-b838-55e6f45dc544-kube-api-access-6tssr\") pod \"service-ca-865cb79987-l5rfr\" (UID: \"d1c4282e-fd5a-4971-b838-55e6f45dc544\") " pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.804832 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.804803 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-l5rfr" Apr 16 20:14:18.828310 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.828287 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:18.828411 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.828371 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:18.828459 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:18.828435 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:14:18.828499 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:18.828490 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert podName:eee002f1-2d78-4f01-b6c8-7f7b9567f19b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:19.328475976 +0000 UTC m=+156.617401743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nvxzk" (UID: "eee002f1-2d78-4f01-b6c8-7f7b9567f19b") : secret "networking-console-plugin-cert" not found Apr 16 20:14:18.829013 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.828996 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:18.916611 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:18.916583 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-l5rfr"] Apr 16 20:14:18.919458 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:18.919432 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1c4282e_fd5a_4971_b838_55e6f45dc544.slice/crio-f6e777fe67eed92d86d9f91ba93fda387bfc28646cfcda97e6a76f430af3e234 WatchSource:0}: Error finding container f6e777fe67eed92d86d9f91ba93fda387bfc28646cfcda97e6a76f430af3e234: Status 404 returned error can't find the container with id f6e777fe67eed92d86d9f91ba93fda387bfc28646cfcda97e6a76f430af3e234 Apr 16 20:14:19.025426 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:19.025398 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bh4x8_48db7bb0-6c87-484f-b5df-58ae1720d8f9/dns-node-resolver/0.log" Apr 16 20:14:19.333251 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:19.333217 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:19.333434 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:19.333371 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:14:19.333505 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:19.333443 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert podName:eee002f1-2d78-4f01-b6c8-7f7b9567f19b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:20.333426706 +0000 UTC m=+157.622352469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nvxzk" (UID: "eee002f1-2d78-4f01-b6c8-7f7b9567f19b") : secret "networking-console-plugin-cert" not found Apr 16 20:14:19.747082 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:19.747045 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-l5rfr" event={"ID":"d1c4282e-fd5a-4971-b838-55e6f45dc544","Type":"ContainerStarted","Data":"f6e777fe67eed92d86d9f91ba93fda387bfc28646cfcda97e6a76f430af3e234"} Apr 16 20:14:20.226577 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:20.226536 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-btw62_172b2c56-3bf0-4eef-aab2-4934181bce38/node-ca/0.log" Apr 16 20:14:20.342911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:20.342878 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:20.343095 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:20.343045 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:14:20.343158 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:20.343122 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert podName:eee002f1-2d78-4f01-b6c8-7f7b9567f19b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:22.343100767 +0000 UTC m=+159.632026540 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nvxzk" (UID: "eee002f1-2d78-4f01-b6c8-7f7b9567f19b") : secret "networking-console-plugin-cert" not found Apr 16 20:14:20.750069 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:20.749984 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-l5rfr" event={"ID":"d1c4282e-fd5a-4971-b838-55e6f45dc544","Type":"ContainerStarted","Data":"549f56bf8913a843a1e842e336b6a4ad67dfa6c4b0e0f2175ecd44032a2a86c3"} Apr 16 20:14:20.767323 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:20.767271 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-l5rfr" podStartSLOduration=1.202738173 podStartE2EDuration="2.767258797s" podCreationTimestamp="2026-04-16 20:14:18 +0000 UTC" firstStartedPulling="2026-04-16 20:14:18.921290683 +0000 UTC m=+156.210216446" lastFinishedPulling="2026-04-16 20:14:20.485811307 +0000 UTC m=+157.774737070" observedRunningTime="2026-04-16 20:14:20.765803012 +0000 UTC m=+158.054728798" watchObservedRunningTime="2026-04-16 20:14:20.767258797 +0000 UTC m=+158.056184581" Apr 16 20:14:21.823911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:21.823865 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-v8mkb_49451082-0796-4a50-af11-a585eef9af8c/kube-storage-version-migrator-operator/0.log" Apr 16 20:14:22.358256 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:22.358216 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:22.358409 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.358362 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:14:22.358448 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.358421 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert podName:eee002f1-2d78-4f01-b6c8-7f7b9567f19b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:26.358404565 +0000 UTC m=+163.647330342 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nvxzk" (UID: "eee002f1-2d78-4f01-b6c8-7f7b9567f19b") : secret "networking-console-plugin-cert" not found Apr 16 20:14:22.458692 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:22.458658 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") pod \"image-registry-58f999fc8b-87hbq\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:14:22.458839 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:22.458697 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:14:22.458839 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.458789 2563 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 20:14:22.458839 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.458819 2563 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 20:14:22.458839 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.458836 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls podName:ed1e1b27-b156-463d-9ee6-eaa33682d57c nodeName:}" failed. No retries permitted until 2026-04-16 20:16:24.458823118 +0000 UTC m=+281.747748882 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls") pod "dns-default-wfhdc" (UID: "ed1e1b27-b156-463d-9ee6-eaa33682d57c") : secret "dns-default-metrics-tls" not found Apr 16 20:14:22.458839 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.458840 2563 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-58f999fc8b-87hbq: secret "image-registry-tls" not found Apr 16 20:14:22.459025 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.458899 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls podName:a36a7e81-30a8-46f8-b55e-9a5b61290032 nodeName:}" failed. No retries permitted until 2026-04-16 20:16:24.45888269 +0000 UTC m=+281.747808458 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls") pod "image-registry-58f999fc8b-87hbq" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032") : secret "image-registry-tls" not found Apr 16 20:14:22.559824 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:22.559794 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:14:22.559979 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.559885 2563 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 20:14:22.559979 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:22.559953 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert podName:d78ddec9-9c5c-40a0-b5b1-d748cb8a110c nodeName:}" failed. No retries permitted until 2026-04-16 20:16:24.559936441 +0000 UTC m=+281.848862209 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert") pod "ingress-canary-qsrv5" (UID: "d78ddec9-9c5c-40a0-b5b1-d748cb8a110c") : secret "canary-serving-cert" not found Apr 16 20:14:26.386478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:26.386443 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:26.386893 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:26.386541 2563 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 20:14:26.386893 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:26.386608 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert podName:eee002f1-2d78-4f01-b6c8-7f7b9567f19b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:34.386594716 +0000 UTC m=+171.675520479 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-nvxzk" (UID: "eee002f1-2d78-4f01-b6c8-7f7b9567f19b") : secret "networking-console-plugin-cert" not found Apr 16 20:14:31.294196 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:31.294164 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:14:31.294554 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:31.294164 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:14:33.295427 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:33.295357 2563 util.go:30] "No sandbox for pod can be found. 
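The durationBeforeRetry values above trace the kubelet's per-operation retry schedule: the networking-console-plugin-cert mount is retried after 500ms, 1s, 2s, 4s, and 8s, doubling on each failure, while the longer-failing cert, metrics-tls, and registry-tls mounts have already passed 1m4s and reached a 2m2s ceiling. A minimal Python sketch of that doubling-with-cap schedule (the 500ms start and 2m2s cap are read off these entries, not taken from kubelet source):

    # Reproduce the retry delays observed above: 0.5s, 1s, 2s, 4s, 8s, ... capped at 122s (2m2s).
    def retry_delays(initial=0.5, factor=2.0, cap=122.0):
        delay = initial
        while True:
            yield delay
            delay = min(delay * factor, cap)

    gen = retry_delays()
    print([next(gen) for _ in range(10)])
    # [0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0, 64.0, 122.0, 122.0]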
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:14:34.440228 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:34.440193 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:34.442494 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:34.442472 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/eee002f1-2d78-4f01-b6c8-7f7b9567f19b-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-nvxzk\" (UID: \"eee002f1-2d78-4f01-b6c8-7f7b9567f19b\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:34.510401 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:34.510379 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" Apr 16 20:14:34.631483 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:34.631354 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk"] Apr 16 20:14:34.633724 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:34.633699 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeee002f1_2d78_4f01_b6c8_7f7b9567f19b.slice/crio-7850070dd9bb48adcaa5343174b3c9862a1a7898c04721eaae16ed0acf4e9816 WatchSource:0}: Error finding container 7850070dd9bb48adcaa5343174b3c9862a1a7898c04721eaae16ed0acf4e9816: Status 404 returned error can't find the container with id 7850070dd9bb48adcaa5343174b3c9862a1a7898c04721eaae16ed0acf4e9816 Apr 16 20:14:34.784079 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:34.784020 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" event={"ID":"eee002f1-2d78-4f01-b6c8-7f7b9567f19b","Type":"ContainerStarted","Data":"7850070dd9bb48adcaa5343174b3c9862a1a7898c04721eaae16ed0acf4e9816"} Apr 16 20:14:35.788498 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:35.788425 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" event={"ID":"eee002f1-2d78-4f01-b6c8-7f7b9567f19b","Type":"ContainerStarted","Data":"f7bbe675fdca5c0e86df14f0f9eaec93d56acb1d2ae806ac6c370169b5be3ffa"} Apr 16 20:14:35.807313 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:35.807268 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-nvxzk" podStartSLOduration=16.960120072 podStartE2EDuration="17.807254498s" podCreationTimestamp="2026-04-16 20:14:18 +0000 UTC" firstStartedPulling="2026-04-16 20:14:34.635745734 +0000 UTC m=+171.924671512" lastFinishedPulling="2026-04-16 20:14:35.482880175 +0000 UTC m=+172.771805938" observedRunningTime="2026-04-16 20:14:35.807014487 +0000 UTC m=+173.095940273" watchObservedRunningTime="2026-04-16 20:14:35.807254498 +0000 UTC m=+173.096180283" Apr 16 20:14:38.379274 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.379240 2563 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["openshift-insights/insights-runtime-extractor-m64qv"] Apr 16 20:14:38.382389 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.382372 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.395860 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.395843 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 20:14:38.396370 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.396339 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 20:14:38.396465 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.396339 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 20:14:38.396465 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.396347 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 20:14:38.396465 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.396388 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-zvwln\"" Apr 16 20:14:38.426031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.426009 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m64qv"] Apr 16 20:14:38.466540 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.466517 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/77f4261b-6ea9-40a7-84ba-cdb40f512c02-crio-socket\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.466651 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.466585 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/77f4261b-6ea9-40a7-84ba-cdb40f512c02-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.466651 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.466614 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/77f4261b-6ea9-40a7-84ba-cdb40f512c02-data-volume\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.466723 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.466657 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/77f4261b-6ea9-40a7-84ba-cdb40f512c02-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.466757 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.466743 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9w4jw\" (UniqueName: \"kubernetes.io/projected/77f4261b-6ea9-40a7-84ba-cdb40f512c02-kube-api-access-9w4jw\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567092 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567070 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/77f4261b-6ea9-40a7-84ba-cdb40f512c02-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567112 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9w4jw\" (UniqueName: \"kubernetes.io/projected/77f4261b-6ea9-40a7-84ba-cdb40f512c02-kube-api-access-9w4jw\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567136 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/77f4261b-6ea9-40a7-84ba-cdb40f512c02-crio-socket\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567281 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567209 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/77f4261b-6ea9-40a7-84ba-cdb40f512c02-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567281 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567247 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/77f4261b-6ea9-40a7-84ba-cdb40f512c02-data-volume\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567379 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567367 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/77f4261b-6ea9-40a7-84ba-cdb40f512c02-crio-socket\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567580 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567544 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/77f4261b-6ea9-40a7-84ba-cdb40f512c02-data-volume\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.567682 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.567667 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/77f4261b-6ea9-40a7-84ba-cdb40f512c02-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.569322 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.569302 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/77f4261b-6ea9-40a7-84ba-cdb40f512c02-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.582605 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.582580 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w4jw\" (UniqueName: \"kubernetes.io/projected/77f4261b-6ea9-40a7-84ba-cdb40f512c02-kube-api-access-9w4jw\") pod \"insights-runtime-extractor-m64qv\" (UID: \"77f4261b-6ea9-40a7-84ba-cdb40f512c02\") " pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.690234 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.690212 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-m64qv" Apr 16 20:14:38.808468 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:38.808442 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-m64qv"] Apr 16 20:14:38.812010 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:38.811984 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77f4261b_6ea9_40a7_84ba_cdb40f512c02.slice/crio-4348be911c162f56d9784776587e9d202a5981a86d0ace537eb8fc7f88b65f89 WatchSource:0}: Error finding container 4348be911c162f56d9784776587e9d202a5981a86d0ace537eb8fc7f88b65f89: Status 404 returned error can't find the container with id 4348be911c162f56d9784776587e9d202a5981a86d0ace537eb8fc7f88b65f89 Apr 16 20:14:39.798396 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:39.798328 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m64qv" event={"ID":"77f4261b-6ea9-40a7-84ba-cdb40f512c02","Type":"ContainerStarted","Data":"230c5fe217e93907e65ae4c73697bfa43ca6357c6c656867fa007b31c3ace506"} Apr 16 20:14:39.798396 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:39.798364 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m64qv" event={"ID":"77f4261b-6ea9-40a7-84ba-cdb40f512c02","Type":"ContainerStarted","Data":"1bf9c4ab17f2f07a1b861a29744243e972db4eb7feb4e9c2160d137c5a26e147"} Apr 16 20:14:39.798396 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:39.798374 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m64qv" event={"ID":"77f4261b-6ea9-40a7-84ba-cdb40f512c02","Type":"ContainerStarted","Data":"4348be911c162f56d9784776587e9d202a5981a86d0ace537eb8fc7f88b65f89"} Apr 16 20:14:41.805733 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:41.805699 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-m64qv" event={"ID":"77f4261b-6ea9-40a7-84ba-cdb40f512c02","Type":"ContainerStarted","Data":"3d953cc7a9cc165c68161cce36ddfc7d8727ee6dc33ddf6eb47b009d4e13107c"} Apr 16 20:14:41.838691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:41.838649 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-insights/insights-runtime-extractor-m64qv" podStartSLOduration=1.740369744 podStartE2EDuration="3.838637424s" podCreationTimestamp="2026-04-16 20:14:38 +0000 UTC" firstStartedPulling="2026-04-16 20:14:38.862078554 +0000 UTC m=+176.151004321" lastFinishedPulling="2026-04-16 20:14:40.960346217 +0000 UTC m=+178.249272001" observedRunningTime="2026-04-16 20:14:41.837867003 +0000 UTC m=+179.126792787" watchObservedRunningTime="2026-04-16 20:14:41.838637424 +0000 UTC m=+179.127563208" Apr 16 20:14:46.729046 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:46.729008 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"] Apr 16 20:14:46.732542 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:46.732525 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57" Apr 16 20:14:46.735008 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:46.734987 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 20:14:46.735143 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:46.735054 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-h5624\"" Apr 16 20:14:46.743034 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:46.743014 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"] Apr 16 20:14:46.819428 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:46.819404 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kq57\" (UID: \"6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57" Apr 16 20:14:46.920401 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:46.920375 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kq57\" (UID: \"6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57" Apr 16 20:14:46.920513 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:46.920478 2563 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 16 20:14:46.920580 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:46.920524 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b-tls-certificates podName:6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b nodeName:}" failed. No retries permitted until 2026-04-16 20:14:47.420511205 +0000 UTC m=+184.709436968 (durationBeforeRetry 500ms). 
Apr 16 20:14:47.423203 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:47.423172 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kq57\" (UID: \"6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"
Apr 16 20:14:47.425531 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:47.425512 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-9kq57\" (UID: \"6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"
Apr 16 20:14:47.640952 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:47.640928 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"
Apr 16 20:14:47.758844 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:47.758820 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"]
Apr 16 20:14:47.762531 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:47.762501 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fd62f91_dfe1_46dd_bfd3_4dd5590d3d5b.slice/crio-dbf26fba5fa866dbf8c6d775cb1ca31cba6e81383df97754b3cbd96327c334c7 WatchSource:0}: Error finding container dbf26fba5fa866dbf8c6d775cb1ca31cba6e81383df97754b3cbd96327c334c7: Status 404 returned error can't find the container with id dbf26fba5fa866dbf8c6d775cb1ca31cba6e81383df97754b3cbd96327c334c7
Apr 16 20:14:47.820600 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:47.820570 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57" event={"ID":"6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b","Type":"ContainerStarted","Data":"dbf26fba5fa866dbf8c6d775cb1ca31cba6e81383df97754b3cbd96327c334c7"}
Apr 16 20:14:48.824470 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:48.824444 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57" event={"ID":"6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b","Type":"ContainerStarted","Data":"dbc701f4be3e82e2659eb50e8ce7be0cc24200a5024228a727ffca4096c34dec"}
Apr 16 20:14:48.824813 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:48.824652 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"
Apr 16 20:14:48.829851 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:48.829832 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57"
Apr 16 20:14:48.841236 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:48.841191 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-9kq57" podStartSLOduration=1.8664165430000001 podStartE2EDuration="2.841175793s" podCreationTimestamp="2026-04-16 20:14:46 +0000 UTC" firstStartedPulling="2026-04-16 20:14:47.766701916 +0000 UTC m=+185.055627679" lastFinishedPulling="2026-04-16 20:14:48.741461165 +0000 UTC m=+186.030386929" observedRunningTime="2026-04-16 20:14:48.841015625 +0000 UTC m=+186.129941414" watchObservedRunningTime="2026-04-16 20:14:48.841175793 +0000 UTC m=+186.130101582"
Apr 16 20:14:55.151693 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.151658 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"]
Apr 16 20:14:55.155182 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.155165 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"
Apr 16 20:14:55.162524 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.162136 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 20:14:55.162524 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.162412 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 20:14:55.162524 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.162509 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 20:14:55.162739 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.162536 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 20:14:55.163212 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.163193 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-5r5vg\""
Apr 16 20:14:55.163302 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.163231 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 20:14:55.169416 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.169394 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcstg\" (UniqueName: \"kubernetes.io/projected/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-kube-api-access-rcstg\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"
Apr 16 20:14:55.169520 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.169483 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"
Apr 16 20:14:55.169613 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.169527 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"
Apr 16 20:14:55.169613 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.169603 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"
Apr 16 20:14:55.179297 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.179275 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xqpsk"]
Apr 16 20:14:55.182092 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.182073 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"]
Apr 16 20:14:55.182231 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.182212 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk"
Apr 16 20:14:55.184745 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.184724 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 20:14:55.187967 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.187948 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 20:14:55.188966 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.188947 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 20:14:55.191977 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.191962 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-v58q2\""
Apr 16 20:14:55.211278 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.211255 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xqpsk"]
Apr 16 20:14:55.213315 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.213296 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-ggss5"]
Apr 16 20:14:55.216164 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.216145 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ggss5"
Apr 16 20:14:55.219069 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.219051 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 20:14:55.219356 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.219334 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q4rxk\""
Apr 16 20:14:55.219766 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.219748 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 20:14:55.220750 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.220732 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 20:14:55.270035 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.269999 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-wtmp\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5"
Apr 16 20:14:55.270138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270053 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk"
Apr 16 20:14:55.270138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270095 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"
Apr 16 20:14:55.270138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270134 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-sys\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5"
Apr 16 20:14:55.270300 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270159 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-root\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5"
Apr 16 20:14:55.270300 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270182 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7888091-f34c-4d28-a756-9d03c77ffcb3-metrics-client-ca\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " 
pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.270300 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270216 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:55.270300 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270243 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.270300 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270275 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-accelerators-collector-config\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.270543 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270315 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03466f36-98b0-4673-b128-7e1d176ae32d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.270543 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270339 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj7hl\" (UniqueName: \"kubernetes.io/projected/a7888091-f34c-4d28-a756-9d03c77ffcb3-kube-api-access-hj7hl\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.270543 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270364 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.270543 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:55.270399 2563 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 20:14:55.270543 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270402 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrlgw\" (UniqueName: \"kubernetes.io/projected/03466f36-98b0-4673-b128-7e1d176ae32d-kube-api-access-qrlgw\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 
20:14:55.270543 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270464 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03466f36-98b0-4673-b128-7e1d176ae32d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.270543 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:55.270507 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls podName:e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:55.770467222 +0000 UTC m=+193.059393000 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-x9qlc" (UID: "e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43") : secret "openshift-state-metrics-tls" not found Apr 16 20:14:55.270904 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270602 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:55.270904 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270671 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-tls\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.270904 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270698 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.270904 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270748 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-textfile\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.270904 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270776 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcstg\" (UniqueName: \"kubernetes.io/projected/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-kube-api-access-rcstg\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:55.270904 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.270783 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:55.272983 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.272954 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:55.295211 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.295188 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcstg\" (UniqueName: \"kubernetes.io/projected/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-kube-api-access-rcstg\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:55.371048 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371022 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-sys\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371171 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371054 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-root\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371171 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371076 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7888091-f34c-4d28-a756-9d03c77ffcb3-metrics-client-ca\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371171 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371119 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371171 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371127 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-sys\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371171 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371149 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: 
\"kubernetes.io/configmap/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-accelerators-collector-config\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371171 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371151 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-root\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371212 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03466f36-98b0-4673-b128-7e1d176ae32d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371241 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj7hl\" (UniqueName: \"kubernetes.io/projected/a7888091-f34c-4d28-a756-9d03c77ffcb3-kube-api-access-hj7hl\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371269 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371298 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qrlgw\" (UniqueName: \"kubernetes.io/projected/03466f36-98b0-4673-b128-7e1d176ae32d-kube-api-access-qrlgw\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371331 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03466f36-98b0-4673-b128-7e1d176ae32d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371385 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-tls\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371411 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-tls\") pod 
\"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.371478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371444 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-textfile\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371875 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371497 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-wtmp\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371875 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371543 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.371875 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371547 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/03466f36-98b0-4673-b128-7e1d176ae32d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.371875 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:55.371693 2563 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 20:14:55.371875 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371729 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-accelerators-collector-config\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.371875 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:55.371768 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-tls podName:a7888091-f34c-4d28-a756-9d03c77ffcb3 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:55.871750301 +0000 UTC m=+193.160676069 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-tls") pod "node-exporter-ggss5" (UID: "a7888091-f34c-4d28-a756-9d03c77ffcb3") : secret "node-exporter-tls" not found Apr 16 20:14:55.371875 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.371844 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a7888091-f34c-4d28-a756-9d03c77ffcb3-metrics-client-ca\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.372214 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.372084 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-textfile\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.372214 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.372160 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.373031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.372511 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03466f36-98b0-4673-b128-7e1d176ae32d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.373031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.372816 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-wtmp\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.374074 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.374036 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.374670 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.374641 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/03466f36-98b0-4673-b128-7e1d176ae32d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.375594 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.375551 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.389573 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.382291 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj7hl\" (UniqueName: \"kubernetes.io/projected/a7888091-f34c-4d28-a756-9d03c77ffcb3-kube-api-access-hj7hl\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.389573 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.382649 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrlgw\" (UniqueName: \"kubernetes.io/projected/03466f36-98b0-4673-b128-7e1d176ae32d-kube-api-access-qrlgw\") pod \"kube-state-metrics-69db897b98-xqpsk\" (UID: \"03466f36-98b0-4673-b128-7e1d176ae32d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.491286 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.491260 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" Apr 16 20:14:55.623453 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.623430 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-xqpsk"] Apr 16 20:14:55.626118 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:55.626087 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03466f36_98b0_4673_b128_7e1d176ae32d.slice/crio-f8691a89911497cf757a73bc184daaf46c523799ce0ce526f14d252af88475a3 WatchSource:0}: Error finding container f8691a89911497cf757a73bc184daaf46c523799ce0ce526f14d252af88475a3: Status 404 returned error can't find the container with id f8691a89911497cf757a73bc184daaf46c523799ce0ce526f14d252af88475a3 Apr 16 20:14:55.775248 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.775172 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:55.775363 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:55.775344 2563 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found Apr 16 20:14:55.775437 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:14:55.775427 2563 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls podName:e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43 nodeName:}" failed. No retries permitted until 2026-04-16 20:14:56.77540733 +0000 UTC m=+194.064333099 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-x9qlc" (UID: "e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43") : secret "openshift-state-metrics-tls" not found Apr 16 20:14:55.842605 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.842580 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" event={"ID":"03466f36-98b0-4673-b128-7e1d176ae32d","Type":"ContainerStarted","Data":"f8691a89911497cf757a73bc184daaf46c523799ce0ce526f14d252af88475a3"} Apr 16 20:14:55.876135 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.876098 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-tls\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:55.878316 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:55.878297 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/a7888091-f34c-4d28-a756-9d03c77ffcb3-node-exporter-tls\") pod \"node-exporter-ggss5\" (UID: \"a7888091-f34c-4d28-a756-9d03c77ffcb3\") " pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:56.125665 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:56.125594 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-ggss5" Apr 16 20:14:56.133538 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:56.133513 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7888091_f34c_4d28_a756_9d03c77ffcb3.slice/crio-cff3c356d9eb6fda841d5241f4bcd616995bc3bc072752a1946a29c11e29eeea WatchSource:0}: Error finding container cff3c356d9eb6fda841d5241f4bcd616995bc3bc072752a1946a29c11e29eeea: Status 404 returned error can't find the container with id cff3c356d9eb6fda841d5241f4bcd616995bc3bc072752a1946a29c11e29eeea Apr 16 20:14:56.782742 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:56.782713 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:56.785447 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:56.785425 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-x9qlc\" (UID: \"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:56.848297 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:56.848251 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ggss5" event={"ID":"a7888091-f34c-4d28-a756-9d03c77ffcb3","Type":"ContainerStarted","Data":"cff3c356d9eb6fda841d5241f4bcd616995bc3bc072752a1946a29c11e29eeea"} Apr 16 20:14:56.852404 ip-10-0-138-118 kubenswrapper[2563]: I0416 
20:14:56.852175 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" event={"ID":"03466f36-98b0-4673-b128-7e1d176ae32d","Type":"ContainerStarted","Data":"a34649d8434f058a1f69fe667d06ef250f56579d4669d75a6ff0ff99b267af34"} Apr 16 20:14:56.965668 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:56.965640 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" Apr 16 20:14:57.211591 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.211541 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-566fc79bc9-s642c"] Apr 16 20:14:57.221178 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.221154 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc"] Apr 16 20:14:57.221296 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.221277 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.224223 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.224200 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 20:14:57.224338 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.224200 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 20:14:57.224401 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.224203 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 20:14:57.224620 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.224602 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 20:14:57.224929 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.224910 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-46135ocikpfiu\"" Apr 16 20:14:57.225028 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.224961 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 20:14:57.225429 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.225409 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-xzhkn\"" Apr 16 20:14:57.226315 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:57.226272 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6c4b40f_dbb0_49dd_ba5d_a94904cf2f43.slice/crio-eaff49cf04cc9af322c21615f5fb0350fc2c4e310b5fd66ccaad6fb4a0919c64 WatchSource:0}: Error finding container eaff49cf04cc9af322c21615f5fb0350fc2c4e310b5fd66ccaad6fb4a0919c64: Status 404 returned error can't find the container with id eaff49cf04cc9af322c21615f5fb0350fc2c4e310b5fd66ccaad6fb4a0919c64 Apr 16 20:14:57.231451 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.231432 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-566fc79bc9-s642c"] Apr 16 20:14:57.287488 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287454 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.287600 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287531 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.287600 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287582 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwp7c\" (UniqueName: \"kubernetes.io/projected/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-kube-api-access-gwp7c\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.287689 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287608 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-grpc-tls\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.287689 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287667 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-metrics-client-ca\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.287812 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287720 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-tls\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.287812 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287743 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.287812 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.287780 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-web\") pod 
\"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.388834 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.388806 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gwp7c\" (UniqueName: \"kubernetes.io/projected/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-kube-api-access-gwp7c\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.388945 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.388839 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-grpc-tls\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.388945 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.388860 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-metrics-client-ca\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.388945 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.388907 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-tls\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.388945 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.388936 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.389149 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.388973 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.389149 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.389044 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.389149 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.389114 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.389839 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.389809 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-metrics-client-ca\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.391833 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.391810 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.391933 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.391871 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.392342 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.392037 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-grpc-tls\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.392342 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.392305 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.392616 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.392596 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.392696 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.392677 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-secret-thanos-querier-tls\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.397010 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.396987 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-gwp7c\" (UniqueName: \"kubernetes.io/projected/17f468e0-dcb6-4c92-b02e-049ac8f25e1f-kube-api-access-gwp7c\") pod \"thanos-querier-566fc79bc9-s642c\" (UID: \"17f468e0-dcb6-4c92-b02e-049ac8f25e1f\") " pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.533122 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.533089 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:14:57.655868 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.655837 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-566fc79bc9-s642c"] Apr 16 20:14:57.659006 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:14:57.658977 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f468e0_dcb6_4c92_b02e_049ac8f25e1f.slice/crio-4dad2c1eb12653eadf82809073939d7d8a086bf004f0bc817f9415b6bc768c9c WatchSource:0}: Error finding container 4dad2c1eb12653eadf82809073939d7d8a086bf004f0bc817f9415b6bc768c9c: Status 404 returned error can't find the container with id 4dad2c1eb12653eadf82809073939d7d8a086bf004f0bc817f9415b6bc768c9c Apr 16 20:14:57.855955 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.855892 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" event={"ID":"17f468e0-dcb6-4c92-b02e-049ac8f25e1f","Type":"ContainerStarted","Data":"4dad2c1eb12653eadf82809073939d7d8a086bf004f0bc817f9415b6bc768c9c"} Apr 16 20:14:57.857388 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.857360 2563 generic.go:358] "Generic (PLEG): container finished" podID="a7888091-f34c-4d28-a756-9d03c77ffcb3" containerID="56d4bde7cb8506abd130a5ecca3a364feee238b26f6211180df82489d1fd681b" exitCode=0 Apr 16 20:14:57.857493 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.857456 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ggss5" event={"ID":"a7888091-f34c-4d28-a756-9d03c77ffcb3","Type":"ContainerDied","Data":"56d4bde7cb8506abd130a5ecca3a364feee238b26f6211180df82489d1fd681b"} Apr 16 20:14:57.859845 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.859804 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" event={"ID":"03466f36-98b0-4673-b128-7e1d176ae32d","Type":"ContainerStarted","Data":"efc4af812bd06147ffb70fc1f9b743d144096c77bf2f95d8820705837ca3144b"} Apr 16 20:14:57.859946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.859849 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" event={"ID":"03466f36-98b0-4673-b128-7e1d176ae32d","Type":"ContainerStarted","Data":"c20a1afa3b4f757aeba7cbb9f16fb7345186eac16c5cb1978e6800d044dfb05e"} Apr 16 20:14:57.861655 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.861631 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" event={"ID":"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43","Type":"ContainerStarted","Data":"de1a662dc6d956de7abbfafdd0d604a844fb032c35674c97e11999e3a5378092"} Apr 16 20:14:57.861736 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.861661 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" 
event={"ID":"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43","Type":"ContainerStarted","Data":"2b6cacefda3ca65097aeac75715b7fdebb54d80a2c56ea4ce1bfcff29c8beb8e"} Apr 16 20:14:57.861736 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.861675 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" event={"ID":"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43","Type":"ContainerStarted","Data":"eaff49cf04cc9af322c21615f5fb0350fc2c4e310b5fd66ccaad6fb4a0919c64"} Apr 16 20:14:57.934139 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:57.934069 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-xqpsk" podStartSLOduration=1.835275299 podStartE2EDuration="2.934054543s" podCreationTimestamp="2026-04-16 20:14:55 +0000 UTC" firstStartedPulling="2026-04-16 20:14:55.628036774 +0000 UTC m=+192.916962547" lastFinishedPulling="2026-04-16 20:14:56.726816025 +0000 UTC m=+194.015741791" observedRunningTime="2026-04-16 20:14:57.932430001 +0000 UTC m=+195.221355786" watchObservedRunningTime="2026-04-16 20:14:57.934054543 +0000 UTC m=+195.222980329" Apr 16 20:14:58.867060 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:58.866981 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" event={"ID":"e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43","Type":"ContainerStarted","Data":"e72a52469b689034f6de336be573e38b2947dbdf2e9dd8e27878b16a6080739a"} Apr 16 20:14:58.869014 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:58.868989 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ggss5" event={"ID":"a7888091-f34c-4d28-a756-9d03c77ffcb3","Type":"ContainerStarted","Data":"9d4fe9bb2fcc7beecb54a2bd72c8612ca788960f71af3c65ea15ecae19f7436c"} Apr 16 20:14:58.869118 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:58.869019 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-ggss5" event={"ID":"a7888091-f34c-4d28-a756-9d03c77ffcb3","Type":"ContainerStarted","Data":"699573b683700a7b53fe4cbc9f41f38015f27e2e13f1aa562ee7c5cf493a8634"} Apr 16 20:14:58.890135 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:58.890090 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-x9qlc" podStartSLOduration=2.8241522420000003 podStartE2EDuration="3.890077481s" podCreationTimestamp="2026-04-16 20:14:55 +0000 UTC" firstStartedPulling="2026-04-16 20:14:57.366241561 +0000 UTC m=+194.655167325" lastFinishedPulling="2026-04-16 20:14:58.432166787 +0000 UTC m=+195.721092564" observedRunningTime="2026-04-16 20:14:58.887964904 +0000 UTC m=+196.176890692" watchObservedRunningTime="2026-04-16 20:14:58.890077481 +0000 UTC m=+196.179003266" Apr 16 20:14:58.924942 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:58.924903 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-ggss5" podStartSLOduration=2.920476688 podStartE2EDuration="3.924888555s" podCreationTimestamp="2026-04-16 20:14:55 +0000 UTC" firstStartedPulling="2026-04-16 20:14:56.135454531 +0000 UTC m=+193.424380300" lastFinishedPulling="2026-04-16 20:14:57.139866388 +0000 UTC m=+194.428792167" observedRunningTime="2026-04-16 20:14:58.923126058 +0000 UTC m=+196.212051856" watchObservedRunningTime="2026-04-16 20:14:58.924888555 +0000 UTC m=+196.213814341" Apr 16 20:14:59.575248 ip-10-0-138-118 
kubenswrapper[2563]: I0416 20:14:59.575184 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-74f849f78c-cnf9b"] Apr 16 20:14:59.578344 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.578327 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.589215 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.589194 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1n2o63s4bl9rq\"" Apr 16 20:14:59.589696 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.589683 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-m2xk2\"" Apr 16 20:14:59.589793 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.589713 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 20:14:59.589793 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.589716 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 20:14:59.589793 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.589730 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 20:14:59.589793 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.589749 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 20:14:59.600887 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.600868 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74f849f78c-cnf9b"] Apr 16 20:14:59.609805 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.609786 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aee138ef-22a8-4713-af50-26f151c86fe4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.609894 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.609811 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aee138ef-22a8-4713-af50-26f151c86fe4-audit-log\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.609894 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.609831 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aee138ef-22a8-4713-af50-26f151c86fe4-metrics-server-audit-profiles\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.609959 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.609901 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: 
\"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-secret-metrics-server-tls\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.609959 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.609934 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-client-ca-bundle\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.609959 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.609951 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs2sk\" (UniqueName: \"kubernetes.io/projected/aee138ef-22a8-4713-af50-26f151c86fe4-kube-api-access-zs2sk\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.610045 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.609978 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-secret-metrics-server-client-certs\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711204 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711183 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-client-ca-bundle\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711288 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711209 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zs2sk\" (UniqueName: \"kubernetes.io/projected/aee138ef-22a8-4713-af50-26f151c86fe4-kube-api-access-zs2sk\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711288 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711232 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-secret-metrics-server-client-certs\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711359 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711291 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aee138ef-22a8-4713-af50-26f151c86fe4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711359 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711307 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aee138ef-22a8-4713-af50-26f151c86fe4-audit-log\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711359 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711323 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aee138ef-22a8-4713-af50-26f151c86fe4-metrics-server-audit-profiles\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711359 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711356 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-secret-metrics-server-tls\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.711759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.711725 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/aee138ef-22a8-4713-af50-26f151c86fe4-audit-log\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.712032 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.712001 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aee138ef-22a8-4713-af50-26f151c86fe4-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.712226 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.712208 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/aee138ef-22a8-4713-af50-26f151c86fe4-metrics-server-audit-profiles\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.713822 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.713796 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-secret-metrics-server-client-certs\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.713917 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.713848 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-secret-metrics-server-tls\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.713917 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.713888 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aee138ef-22a8-4713-af50-26f151c86fe4-client-ca-bundle\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.722472 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.722446 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs2sk\" (UniqueName: \"kubernetes.io/projected/aee138ef-22a8-4713-af50-26f151c86fe4-kube-api-access-zs2sk\") pod \"metrics-server-74f849f78c-cnf9b\" (UID: \"aee138ef-22a8-4713-af50-26f151c86fe4\") " pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.874920 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.874849 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" event={"ID":"17f468e0-dcb6-4c92-b02e-049ac8f25e1f","Type":"ContainerStarted","Data":"60e9f616b482c93417f794946e174ecf53164fb92b1e06f92c530a7c3d58b049"} Apr 16 20:14:59.875385 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.875347 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" event={"ID":"17f468e0-dcb6-4c92-b02e-049ac8f25e1f","Type":"ContainerStarted","Data":"93ea965720e8b4a51ad574d8c74986d418eff0f1aa23f2d2f15c1d9dc0b92743"} Apr 16 20:14:59.875385 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.875379 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" event={"ID":"17f468e0-dcb6-4c92-b02e-049ac8f25e1f","Type":"ContainerStarted","Data":"f6509c3b9bf55bc24f97148993b245e12410b0c9ec52fa061799a192175dc9dd"} Apr 16 20:14:59.886482 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.886458 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:14:59.959917 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.959890 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c"] Apr 16 20:14:59.964260 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.964239 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" Apr 16 20:14:59.967713 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.967628 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 16 20:14:59.967774 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.967718 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-jmnzd\"" Apr 16 20:14:59.986388 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:14:59.986361 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c"] Apr 16 20:15:00.014455 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.014424 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/39ac0b69-dd53-4c7e-92bc-df176b8bd42e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jxh4c\" (UID: \"39ac0b69-dd53-4c7e-92bc-df176b8bd42e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" Apr 16 20:15:00.042752 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.042726 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-74f849f78c-cnf9b"] Apr 16 20:15:00.046875 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:15:00.046765 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee138ef_22a8_4713_af50_26f151c86fe4.slice/crio-49b40c1afb74913b611b6126771dbecbb0298488db9e3e0a7407e8a9cc7d50b6 WatchSource:0}: Error finding container 49b40c1afb74913b611b6126771dbecbb0298488db9e3e0a7407e8a9cc7d50b6: Status 404 returned error can't find the container with id 49b40c1afb74913b611b6126771dbecbb0298488db9e3e0a7407e8a9cc7d50b6 Apr 16 20:15:00.115731 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.115644 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/39ac0b69-dd53-4c7e-92bc-df176b8bd42e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jxh4c\" (UID: \"39ac0b69-dd53-4c7e-92bc-df176b8bd42e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" Apr 16 20:15:00.118760 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.118739 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/39ac0b69-dd53-4c7e-92bc-df176b8bd42e-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-jxh4c\" (UID: \"39ac0b69-dd53-4c7e-92bc-df176b8bd42e\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" Apr 16 20:15:00.277638 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.277605 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" Apr 16 20:15:00.303518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.303491 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58f999fc8b-87hbq"] Apr 16 20:15:00.303772 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:15:00.303750 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" podUID="a36a7e81-30a8-46f8-b55e-9a5b61290032" Apr 16 20:15:00.401513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.401488 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c"] Apr 16 20:15:00.403514 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:15:00.403484 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39ac0b69_dd53_4c7e_92bc_df176b8bd42e.slice/crio-1e85a64fa7e855b37b719695537a7b0f27e8ad30942c33420caa13043bdbbce9 WatchSource:0}: Error finding container 1e85a64fa7e855b37b719695537a7b0f27e8ad30942c33420caa13043bdbbce9: Status 404 returned error can't find the container with id 1e85a64fa7e855b37b719695537a7b0f27e8ad30942c33420caa13043bdbbce9 Apr 16 20:15:00.881533 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.881493 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" event={"ID":"17f468e0-dcb6-4c92-b02e-049ac8f25e1f","Type":"ContainerStarted","Data":"0af1d9cf6d0bbc1bde65143f19c22e24c7426d2767895b3e96632832216693e1"} Apr 16 20:15:00.881533 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.881532 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" event={"ID":"17f468e0-dcb6-4c92-b02e-049ac8f25e1f","Type":"ContainerStarted","Data":"52b7d2dba87506a34c3876390cb46917cfc6f96bc807e2d8f851503797785283"} Apr 16 20:15:00.882047 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.881546 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" event={"ID":"17f468e0-dcb6-4c92-b02e-049ac8f25e1f","Type":"ContainerStarted","Data":"4cf39c4703d729d97fc9ade56a36b02ecd88cf155c111ffddc2243474713ece5"} Apr 16 20:15:00.882047 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.881727 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:15:00.883156 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.883127 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" event={"ID":"aee138ef-22a8-4713-af50-26f151c86fe4","Type":"ContainerStarted","Data":"49b40c1afb74913b611b6126771dbecbb0298488db9e3e0a7407e8a9cc7d50b6"} Apr 16 20:15:00.885279 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.884826 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:15:00.885279 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.885159 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" event={"ID":"39ac0b69-dd53-4c7e-92bc-df176b8bd42e","Type":"ContainerStarted","Data":"1e85a64fa7e855b37b719695537a7b0f27e8ad30942c33420caa13043bdbbce9"} Apr 16 20:15:00.890091 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.890068 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:15:00.907975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.907887 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" podStartSLOduration=1.236719159 podStartE2EDuration="3.907871787s" podCreationTimestamp="2026-04-16 20:14:57 +0000 UTC" firstStartedPulling="2026-04-16 20:14:57.660776839 +0000 UTC m=+194.949702607" lastFinishedPulling="2026-04-16 20:15:00.331929468 +0000 UTC m=+197.620855235" observedRunningTime="2026-04-16 20:15:00.906613705 +0000 UTC m=+198.195539492" watchObservedRunningTime="2026-04-16 20:15:00.907871787 +0000 UTC m=+198.196797576" Apr 16 20:15:00.921481 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921433 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-image-registry-private-configuration\") pod \"a36a7e81-30a8-46f8-b55e-9a5b61290032\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " Apr 16 20:15:00.921598 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921508 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a36a7e81-30a8-46f8-b55e-9a5b61290032-ca-trust-extracted\") pod \"a36a7e81-30a8-46f8-b55e-9a5b61290032\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " Apr 16 20:15:00.921598 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921552 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-installation-pull-secrets\") pod \"a36a7e81-30a8-46f8-b55e-9a5b61290032\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " Apr 16 20:15:00.921752 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921604 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-bound-sa-token\") pod \"a36a7e81-30a8-46f8-b55e-9a5b61290032\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " Apr 16 20:15:00.921752 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921633 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gpgl\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-kube-api-access-2gpgl\") pod \"a36a7e81-30a8-46f8-b55e-9a5b61290032\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " Apr 16 20:15:00.921752 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921661 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-certificates\") 
pod \"a36a7e81-30a8-46f8-b55e-9a5b61290032\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " Apr 16 20:15:00.921752 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921686 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-trusted-ca\") pod \"a36a7e81-30a8-46f8-b55e-9a5b61290032\" (UID: \"a36a7e81-30a8-46f8-b55e-9a5b61290032\") " Apr 16 20:15:00.921949 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.921784 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a36a7e81-30a8-46f8-b55e-9a5b61290032-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a36a7e81-30a8-46f8-b55e-9a5b61290032" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:15:00.922467 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.922089 2563 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a36a7e81-30a8-46f8-b55e-9a5b61290032-ca-trust-extracted\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:00.922467 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.922114 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a36a7e81-30a8-46f8-b55e-9a5b61290032" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:00.922467 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.922159 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a36a7e81-30a8-46f8-b55e-9a5b61290032" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:15:00.924719 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.924656 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a36a7e81-30a8-46f8-b55e-9a5b61290032" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:15:00.924719 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.924664 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a36a7e81-30a8-46f8-b55e-9a5b61290032" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:15:00.925287 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.925255 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "a36a7e81-30a8-46f8-b55e-9a5b61290032" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032"). 
InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:15:00.925497 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:00.925470 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-kube-api-access-2gpgl" (OuterVolumeSpecName: "kube-api-access-2gpgl") pod "a36a7e81-30a8-46f8-b55e-9a5b61290032" (UID: "a36a7e81-30a8-46f8-b55e-9a5b61290032"). InnerVolumeSpecName "kube-api-access-2gpgl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:15:01.022801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.022687 2563 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-installation-pull-secrets\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:01.022801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.022719 2563 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-bound-sa-token\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:01.022801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.022731 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2gpgl\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-kube-api-access-2gpgl\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:01.022801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.022745 2563 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-certificates\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:01.022801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.022760 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a36a7e81-30a8-46f8-b55e-9a5b61290032-trusted-ca\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:01.022801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.022776 2563 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/a36a7e81-30a8-46f8-b55e-9a5b61290032-image-registry-private-configuration\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:01.889603 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.889550 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" event={"ID":"aee138ef-22a8-4713-af50-26f151c86fe4","Type":"ContainerStarted","Data":"573a1e33761a3a77fcd6cc9ed28d8c0d4ee25c7052abdc49b491495a03242ab1"} Apr 16 20:15:01.890963 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.890929 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" event={"ID":"39ac0b69-dd53-4c7e-92bc-df176b8bd42e","Type":"ContainerStarted","Data":"61341e935975301046ee649b437d414308092e0e7f302b50ab617dcff97fde7b"} Apr 16 20:15:01.891077 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.891057 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-58f999fc8b-87hbq" Apr 16 20:15:01.891185 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.891166 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" Apr 16 20:15:01.896741 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.896721 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" Apr 16 20:15:01.912326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.912283 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" podStartSLOduration=1.231787918 podStartE2EDuration="2.91227265s" podCreationTimestamp="2026-04-16 20:14:59 +0000 UTC" firstStartedPulling="2026-04-16 20:15:00.049059996 +0000 UTC m=+197.337985774" lastFinishedPulling="2026-04-16 20:15:01.729544741 +0000 UTC m=+199.018470506" observedRunningTime="2026-04-16 20:15:01.911746588 +0000 UTC m=+199.200672375" watchObservedRunningTime="2026-04-16 20:15:01.91227265 +0000 UTC m=+199.201198435" Apr 16 20:15:01.961552 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.961529 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-58f999fc8b-87hbq"] Apr 16 20:15:01.973522 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.973503 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-58f999fc8b-87hbq"] Apr 16 20:15:01.998791 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:01.998712 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-jxh4c" podStartSLOduration=1.6716719389999999 podStartE2EDuration="2.998699102s" podCreationTimestamp="2026-04-16 20:14:59 +0000 UTC" firstStartedPulling="2026-04-16 20:15:00.405588598 +0000 UTC m=+197.694514360" lastFinishedPulling="2026-04-16 20:15:01.732615757 +0000 UTC m=+199.021541523" observedRunningTime="2026-04-16 20:15:01.9962942 +0000 UTC m=+199.285219982" watchObservedRunningTime="2026-04-16 20:15:01.998699102 +0000 UTC m=+199.287624887" Apr 16 20:15:02.032660 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:02.032638 2563 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a36a7e81-30a8-46f8-b55e-9a5b61290032-registry-tls\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:15:03.298214 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:03.298185 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a36a7e81-30a8-46f8-b55e-9a5b61290032" path="/var/lib/kubelet/pods/a36a7e81-30a8-46f8-b55e-9a5b61290032/volumes" Apr 16 20:15:06.896702 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:06.896676 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-566fc79bc9-s642c" Apr 16 20:15:16.639322 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.639279 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b5447d78-kcv6g"] Apr 16 20:15:16.645431 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.645413 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.651380 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.651345 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 20:15:16.651504 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.651416 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 20:15:16.651818 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.651777 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 20:15:16.651910 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.651844 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 20:15:16.653103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.652885 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 20:15:16.653103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.652998 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 20:15:16.653262 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.653241 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-926v5\"" Apr 16 20:15:16.654267 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.654247 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 20:15:16.655637 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.655614 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5447d78-kcv6g"] Apr 16 20:15:16.661857 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.661831 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 20:15:16.734120 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.734094 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mbc6\" (UniqueName: \"kubernetes.io/projected/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-kube-api-access-4mbc6\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.734222 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.734146 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-service-ca\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.734265 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.734224 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-oauth-serving-cert\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.734303 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.734270 2563 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-serving-cert\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.734303 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.734294 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-oauth-config\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.734376 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.734331 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-config\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.734409 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.734392 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-trusted-ca-bundle\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.835333 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.835311 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-trusted-ca-bundle\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.835392 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.835345 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mbc6\" (UniqueName: \"kubernetes.io/projected/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-kube-api-access-4mbc6\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.835392 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.835378 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-service-ca\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.835482 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.835430 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-oauth-serving-cert\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.835533 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.835500 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-serving-cert\") pod 
\"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.835618 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.835533 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-oauth-config\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.835618 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.835555 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-config\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.836164 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.836127 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-service-ca\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.836262 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.836241 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-config\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.836327 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.836241 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-trusted-ca-bundle\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.836327 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.836314 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-oauth-serving-cert\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.837919 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.837893 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-oauth-config\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.838019 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.837991 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-serving-cert\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.844334 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.844312 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mbc6\" (UniqueName: 
\"kubernetes.io/projected/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-kube-api-access-4mbc6\") pod \"console-7b5447d78-kcv6g\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:16.955457 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:16.955432 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:17.110458 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:17.110429 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5447d78-kcv6g"] Apr 16 20:15:17.113872 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:15:17.113835 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f3bbb0f_20f7_4da0_90c5_42e44f1b68a8.slice/crio-7d15a44b15e32d611712f3208bea4ef7f91116416f9a5e91788b4d91b8d478ec WatchSource:0}: Error finding container 7d15a44b15e32d611712f3208bea4ef7f91116416f9a5e91788b4d91b8d478ec: Status 404 returned error can't find the container with id 7d15a44b15e32d611712f3208bea4ef7f91116416f9a5e91788b4d91b8d478ec Apr 16 20:15:17.934100 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:17.934065 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5447d78-kcv6g" event={"ID":"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8","Type":"ContainerStarted","Data":"7d15a44b15e32d611712f3208bea4ef7f91116416f9a5e91788b4d91b8d478ec"} Apr 16 20:15:19.887317 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:19.887242 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:15:19.887317 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:19.887283 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:15:19.940279 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:19.940249 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5447d78-kcv6g" event={"ID":"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8","Type":"ContainerStarted","Data":"1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b"} Apr 16 20:15:19.960090 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:19.960005 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b5447d78-kcv6g" podStartSLOduration=1.489714755 podStartE2EDuration="3.959988555s" podCreationTimestamp="2026-04-16 20:15:16 +0000 UTC" firstStartedPulling="2026-04-16 20:15:17.115626107 +0000 UTC m=+214.404551872" lastFinishedPulling="2026-04-16 20:15:19.585899898 +0000 UTC m=+216.874825672" observedRunningTime="2026-04-16 20:15:19.959762859 +0000 UTC m=+217.248688646" watchObservedRunningTime="2026-04-16 20:15:19.959988555 +0000 UTC m=+217.248914341" Apr 16 20:15:25.962531 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:25.962496 2563 generic.go:358] "Generic (PLEG): container finished" podID="49451082-0796-4a50-af11-a585eef9af8c" containerID="52d21edc02df6cbfb1c1eec8ac13d62d7c8352f776beb05e7fde1c30a49d8e50" exitCode=0 Apr 16 20:15:25.962923 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:25.962589 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb" 
event={"ID":"49451082-0796-4a50-af11-a585eef9af8c","Type":"ContainerDied","Data":"52d21edc02df6cbfb1c1eec8ac13d62d7c8352f776beb05e7fde1c30a49d8e50"} Apr 16 20:15:25.962923 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:25.962909 2563 scope.go:117] "RemoveContainer" containerID="52d21edc02df6cbfb1c1eec8ac13d62d7c8352f776beb05e7fde1c30a49d8e50" Apr 16 20:15:26.955962 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:26.955935 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:26.956152 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:26.956023 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:26.960481 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:26.960464 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:26.967210 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:26.967191 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-v8mkb" event={"ID":"49451082-0796-4a50-af11-a585eef9af8c","Type":"ContainerStarted","Data":"e623a44bb6c531c110be0e1dec56bc14af45be5b6de91fd760694dc78304fb97"} Apr 16 20:15:26.970719 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:26.970701 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:15:39.891837 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:39.891805 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:15:39.895634 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:39.895612 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-74f849f78c-cnf9b" Apr 16 20:15:55.126230 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:55.126195 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:15:55.128421 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:55.128402 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/51782696-d22a-4882-9ad3-4de29c66583c-metrics-certs\") pod \"network-metrics-daemon-mx2qh\" (UID: \"51782696-d22a-4882-9ad3-4de29c66583c\") " pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:15:55.297256 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:55.297232 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-q2prt\"" Apr 16 20:15:55.304614 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:55.304598 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mx2qh" Apr 16 20:15:55.420187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:55.420156 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mx2qh"] Apr 16 20:15:55.423021 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:15:55.422987 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51782696_d22a_4882_9ad3_4de29c66583c.slice/crio-16ac2864b224481213c86225cd4d6a51af03da52c21a67728e2972df2acc42c4 WatchSource:0}: Error finding container 16ac2864b224481213c86225cd4d6a51af03da52c21a67728e2972df2acc42c4: Status 404 returned error can't find the container with id 16ac2864b224481213c86225cd4d6a51af03da52c21a67728e2972df2acc42c4 Apr 16 20:15:56.045807 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:56.045772 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mx2qh" event={"ID":"51782696-d22a-4882-9ad3-4de29c66583c","Type":"ContainerStarted","Data":"16ac2864b224481213c86225cd4d6a51af03da52c21a67728e2972df2acc42c4"} Apr 16 20:15:57.049616 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:57.049578 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mx2qh" event={"ID":"51782696-d22a-4882-9ad3-4de29c66583c","Type":"ContainerStarted","Data":"78656d383b9160a0530355de778d386514a3ff364b88f862dc38174eb035e1d5"} Apr 16 20:15:57.049616 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:57.049618 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mx2qh" event={"ID":"51782696-d22a-4882-9ad3-4de29c66583c","Type":"ContainerStarted","Data":"4f15a17f2a176c11e20b80dc0a2559bb69fa936701732a83f53564568973f26c"} Apr 16 20:15:57.069657 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:15:57.069613 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mx2qh" podStartSLOduration=253.185674919 podStartE2EDuration="4m14.069600717s" podCreationTimestamp="2026-04-16 20:11:43 +0000 UTC" firstStartedPulling="2026-04-16 20:15:55.424769125 +0000 UTC m=+252.713694896" lastFinishedPulling="2026-04-16 20:15:56.308694922 +0000 UTC m=+253.597620694" observedRunningTime="2026-04-16 20:15:57.06723319 +0000 UTC m=+254.356158974" watchObservedRunningTime="2026-04-16 20:15:57.069600717 +0000 UTC m=+254.358526502" Apr 16 20:16:19.164980 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.164902 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c97ddb46f-zdfxp"] Apr 16 20:16:19.168047 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.168029 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.176771 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.176746 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c97ddb46f-zdfxp"] Apr 16 20:16:19.208993 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.208968 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqmx4\" (UniqueName: \"kubernetes.io/projected/db85e1fb-db90-4d76-9f4a-58eddacb2260-kube-api-access-sqmx4\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.209092 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.209009 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-service-ca\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.209149 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.209088 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-config\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.209190 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.209151 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-trusted-ca-bundle\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.209190 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.209183 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-oauth-serving-cert\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.209260 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.209201 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-oauth-config\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.209260 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.209219 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-serving-cert\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.310410 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.310382 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-service-ca\") pod 
\"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.310526 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.310440 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-config\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.310526 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.310490 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-trusted-ca-bundle\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.310658 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.310525 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-oauth-serving-cert\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.310658 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.310554 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-oauth-config\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.310658 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.310597 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-serving-cert\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.310658 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.310634 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqmx4\" (UniqueName: \"kubernetes.io/projected/db85e1fb-db90-4d76-9f4a-58eddacb2260-kube-api-access-sqmx4\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.311218 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.311197 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-config\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.311307 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.311256 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-service-ca\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.311406 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.311383 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-oauth-serving-cert\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.311466 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.311449 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-trusted-ca-bundle\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.313008 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.312985 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-oauth-config\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.313227 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.313204 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-serving-cert\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.318844 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.318825 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqmx4\" (UniqueName: \"kubernetes.io/projected/db85e1fb-db90-4d76-9f4a-58eddacb2260-kube-api-access-sqmx4\") pod \"console-c97ddb46f-zdfxp\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.477295 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.477267 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:19.595713 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:19.595686 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c97ddb46f-zdfxp"] Apr 16 20:16:19.598946 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:16:19.598907 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb85e1fb_db90_4d76_9f4a_58eddacb2260.slice/crio-e7f36ec5632451c9099955a1fa163fd89bf033fba5af721a7ca169517bad0e85 WatchSource:0}: Error finding container e7f36ec5632451c9099955a1fa163fd89bf033fba5af721a7ca169517bad0e85: Status 404 returned error can't find the container with id e7f36ec5632451c9099955a1fa163fd89bf033fba5af721a7ca169517bad0e85 Apr 16 20:16:20.121033 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:20.120994 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c97ddb46f-zdfxp" event={"ID":"db85e1fb-db90-4d76-9f4a-58eddacb2260","Type":"ContainerStarted","Data":"79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea"} Apr 16 20:16:20.121033 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:20.121036 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c97ddb46f-zdfxp" event={"ID":"db85e1fb-db90-4d76-9f4a-58eddacb2260","Type":"ContainerStarted","Data":"e7f36ec5632451c9099955a1fa163fd89bf033fba5af721a7ca169517bad0e85"} Apr 16 20:16:20.156364 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:20.156312 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c97ddb46f-zdfxp" podStartSLOduration=1.156295966 podStartE2EDuration="1.156295966s" podCreationTimestamp="2026-04-16 20:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:16:20.154175513 +0000 UTC m=+277.443101298" watchObservedRunningTime="2026-04-16 20:16:20.156295966 +0000 UTC m=+277.445221751" Apr 16 20:16:20.742820 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:16:20.742783 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-wfhdc" podUID="ed1e1b27-b156-463d-9ee6-eaa33682d57c" Apr 16 20:16:21.123549 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:21.123478 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wfhdc" Apr 16 20:16:24.548805 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.548765 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:16:24.551189 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.551163 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed1e1b27-b156-463d-9ee6-eaa33682d57c-metrics-tls\") pod \"dns-default-wfhdc\" (UID: \"ed1e1b27-b156-463d-9ee6-eaa33682d57c\") " pod="openshift-dns/dns-default-wfhdc" Apr 16 20:16:24.649988 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.649959 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:16:24.652128 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.652110 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d78ddec9-9c5c-40a0-b5b1-d748cb8a110c-cert\") pod \"ingress-canary-qsrv5\" (UID: \"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c\") " pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:16:24.726810 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.726783 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-dstbq\"" Apr 16 20:16:24.735100 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.735081 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wfhdc" Apr 16 20:16:24.849897 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.849865 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wfhdc"] Apr 16 20:16:24.852697 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:16:24.852671 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded1e1b27_b156_463d_9ee6_eaa33682d57c.slice/crio-aebe84145b8df6c47d979ed35ad0a0b59d93b24400f988e74c8fe40e8a4347bb WatchSource:0}: Error finding container aebe84145b8df6c47d979ed35ad0a0b59d93b24400f988e74c8fe40e8a4347bb: Status 404 returned error can't find the container with id aebe84145b8df6c47d979ed35ad0a0b59d93b24400f988e74c8fe40e8a4347bb Apr 16 20:16:24.898415 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.898394 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-jqqmg\"" Apr 16 20:16:24.906724 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:24.906695 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-qsrv5" Apr 16 20:16:25.029432 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:25.029405 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-qsrv5"] Apr 16 20:16:25.033003 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:16:25.032980 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78ddec9_9c5c_40a0_b5b1_d748cb8a110c.slice/crio-1ab13dfe52450d4170f277cdea53bb065bcf10a90893f6080dc6f32f0f04dd1a WatchSource:0}: Error finding container 1ab13dfe52450d4170f277cdea53bb065bcf10a90893f6080dc6f32f0f04dd1a: Status 404 returned error can't find the container with id 1ab13dfe52450d4170f277cdea53bb065bcf10a90893f6080dc6f32f0f04dd1a Apr 16 20:16:25.134177 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:25.134107 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qsrv5" event={"ID":"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c","Type":"ContainerStarted","Data":"1ab13dfe52450d4170f277cdea53bb065bcf10a90893f6080dc6f32f0f04dd1a"} Apr 16 20:16:25.135105 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:25.135074 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wfhdc" event={"ID":"ed1e1b27-b156-463d-9ee6-eaa33682d57c","Type":"ContainerStarted","Data":"aebe84145b8df6c47d979ed35ad0a0b59d93b24400f988e74c8fe40e8a4347bb"} Apr 16 20:16:27.145745 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:27.145659 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-qsrv5" event={"ID":"d78ddec9-9c5c-40a0-b5b1-d748cb8a110c","Type":"ContainerStarted","Data":"9986e9dab0e11ef76e426c5a4d974fb8c6002c7e1dda6404ff896903a6ece3da"} Apr 16 20:16:27.148374 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:27.148337 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wfhdc" event={"ID":"ed1e1b27-b156-463d-9ee6-eaa33682d57c","Type":"ContainerStarted","Data":"6a491edc68a1b723313a7cdcf6b281be9482e4e2c15ab293dc5c568c704cbffd"} Apr 16 20:16:27.164736 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:27.164677 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-qsrv5" podStartSLOduration=251.209411851 podStartE2EDuration="4m13.16466048s" podCreationTimestamp="2026-04-16 20:12:14 +0000 UTC" firstStartedPulling="2026-04-16 20:16:25.034861521 +0000 UTC m=+282.323787289" lastFinishedPulling="2026-04-16 20:16:26.990110155 +0000 UTC m=+284.279035918" observedRunningTime="2026-04-16 20:16:27.163319409 +0000 UTC m=+284.452245207" watchObservedRunningTime="2026-04-16 20:16:27.16466048 +0000 UTC m=+284.453586267" Apr 16 20:16:28.155540 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:28.155493 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wfhdc" event={"ID":"ed1e1b27-b156-463d-9ee6-eaa33682d57c","Type":"ContainerStarted","Data":"8fadcf3705b31db6b59630f327dc3f63ca60c43d811ebe4956cb07461230b839"} Apr 16 20:16:28.179014 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:28.178972 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-wfhdc" podStartSLOduration=252.046713003 podStartE2EDuration="4m14.178957464s" podCreationTimestamp="2026-04-16 20:12:14 +0000 UTC" firstStartedPulling="2026-04-16 20:16:24.854270719 +0000 UTC m=+282.143196486" 
lastFinishedPulling="2026-04-16 20:16:26.986515178 +0000 UTC m=+284.275440947" observedRunningTime="2026-04-16 20:16:28.178413774 +0000 UTC m=+285.467339562" watchObservedRunningTime="2026-04-16 20:16:28.178957464 +0000 UTC m=+285.467883295" Apr 16 20:16:29.158673 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:29.158637 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wfhdc" Apr 16 20:16:29.477673 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:29.477638 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:29.477844 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:29.477706 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:29.482108 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:29.482084 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:30.164677 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:30.164650 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:16:30.216432 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:30.216399 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b5447d78-kcv6g"] Apr 16 20:16:39.163378 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:39.163348 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wfhdc" Apr 16 20:16:43.174809 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:43.174783 2563 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 20:16:55.238521 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.238465 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7b5447d78-kcv6g" podUID="0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" containerName="console" containerID="cri-o://1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b" gracePeriod=15 Apr 16 20:16:55.472765 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.472744 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b5447d78-kcv6g_0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8/console/0.log" Apr 16 20:16:55.472910 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.472820 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:16:55.569008 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.568941 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mbc6\" (UniqueName: \"kubernetes.io/projected/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-kube-api-access-4mbc6\") pod \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " Apr 16 20:16:55.569008 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.568992 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-oauth-serving-cert\") pod \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " Apr 16 20:16:55.569187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569027 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-config\") pod \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " Apr 16 20:16:55.569187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569061 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-oauth-config\") pod \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " Apr 16 20:16:55.569187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569100 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-service-ca\") pod \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " Apr 16 20:16:55.569187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569143 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-trusted-ca-bundle\") pod \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " Apr 16 20:16:55.569187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569181 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-serving-cert\") pod \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\" (UID: \"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8\") " Apr 16 20:16:55.569546 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569478 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-config" (OuterVolumeSpecName: "console-config") pod "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" (UID: "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:55.569546 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569521 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-service-ca" (OuterVolumeSpecName: "service-ca") pod "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" (UID: "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:55.569546 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569519 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" (UID: "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:55.569700 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.569579 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" (UID: "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:16:55.571230 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.571204 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-kube-api-access-4mbc6" (OuterVolumeSpecName: "kube-api-access-4mbc6") pod "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" (UID: "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8"). InnerVolumeSpecName "kube-api-access-4mbc6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:16:55.571330 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.571243 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" (UID: "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:55.571330 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.571297 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" (UID: "0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:16:55.670004 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.669978 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-trusted-ca-bundle\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:16:55.670004 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.670001 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-serving-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:16:55.670138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.670017 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mbc6\" (UniqueName: \"kubernetes.io/projected/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-kube-api-access-4mbc6\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:16:55.670138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.670030 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-oauth-serving-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:16:55.670138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.670045 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-config\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:16:55.670138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.670058 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-console-oauth-config\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:16:55.670138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:55.670071 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8-service-ca\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:16:56.233029 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.233006 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7b5447d78-kcv6g_0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8/console/0.log" Apr 16 20:16:56.233185 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.233044 2563 generic.go:358] "Generic (PLEG): container finished" podID="0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" containerID="1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b" exitCode=2 Apr 16 20:16:56.233185 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.233131 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7b5447d78-kcv6g" Apr 16 20:16:56.233185 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.233142 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5447d78-kcv6g" event={"ID":"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8","Type":"ContainerDied","Data":"1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b"} Apr 16 20:16:56.233346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.233199 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5447d78-kcv6g" event={"ID":"0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8","Type":"ContainerDied","Data":"7d15a44b15e32d611712f3208bea4ef7f91116416f9a5e91788b4d91b8d478ec"} Apr 16 20:16:56.233346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.233216 2563 scope.go:117] "RemoveContainer" containerID="1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b" Apr 16 20:16:56.241808 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.241701 2563 scope.go:117] "RemoveContainer" containerID="1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b" Apr 16 20:16:56.242031 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:16:56.241971 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b\": container with ID starting with 1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b not found: ID does not exist" containerID="1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b" Apr 16 20:16:56.242031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.241997 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b"} err="failed to get container status \"1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b\": rpc error: code = NotFound desc = could not find container \"1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b\": container with ID starting with 1bf7ce0d116d350976f004354a48dda7ffec4ee657a5f3468be45ef9bf35553b not found: ID does not exist" Apr 16 20:16:56.258898 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.257382 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7b5447d78-kcv6g"] Apr 16 20:16:56.260011 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:56.259972 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7b5447d78-kcv6g"] Apr 16 20:16:57.298516 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:16:57.298487 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" path="/var/lib/kubelet/pods/0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8/volumes" Apr 16 20:17:06.230710 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.230675 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q"] Apr 16 20:17:06.231079 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.230970 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" containerName="console" Apr 16 20:17:06.231079 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.230983 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" containerName="console" Apr 16 
20:17:06.231079 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.231036 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f3bbb0f-20f7-4da0-90c5-42e44f1b68a8" containerName="console" Apr 16 20:17:06.233637 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.233619 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.237708 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.237686 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 20:17:06.237837 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.237707 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 20:17:06.237837 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.237791 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-x844r\"" Apr 16 20:17:06.250656 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.250631 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q"] Apr 16 20:17:06.347569 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.347536 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jdfl\" (UniqueName: \"kubernetes.io/projected/c8596bb3-b088-4a5f-aeee-370a7f8de038-kube-api-access-9jdfl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.347717 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.347645 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.347717 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.347669 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.448135 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.448107 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.448264 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.448141 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.448264 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.448176 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jdfl\" (UniqueName: \"kubernetes.io/projected/c8596bb3-b088-4a5f-aeee-370a7f8de038-kube-api-access-9jdfl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.448488 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.448467 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.448594 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.448510 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.456423 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.456404 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jdfl\" (UniqueName: \"kubernetes.io/projected/c8596bb3-b088-4a5f-aeee-370a7f8de038-kube-api-access-9jdfl\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.542021 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.541953 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:06.661854 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.661834 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q"] Apr 16 20:17:06.664575 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:17:06.664537 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8596bb3_b088_4a5f_aeee_370a7f8de038.slice/crio-ccfcd01a1e3ed4f8e9094002a2cbb59ca7cd6aa6aad7afd41bfa4c234832be0f WatchSource:0}: Error finding container ccfcd01a1e3ed4f8e9094002a2cbb59ca7cd6aa6aad7afd41bfa4c234832be0f: Status 404 returned error can't find the container with id ccfcd01a1e3ed4f8e9094002a2cbb59ca7cd6aa6aad7afd41bfa4c234832be0f Apr 16 20:17:06.666273 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:06.666258 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:17:07.266535 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:07.266492 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" event={"ID":"c8596bb3-b088-4a5f-aeee-370a7f8de038","Type":"ContainerStarted","Data":"ccfcd01a1e3ed4f8e9094002a2cbb59ca7cd6aa6aad7afd41bfa4c234832be0f"} Apr 16 20:17:12.283121 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:12.283088 2563 generic.go:358] "Generic (PLEG): container finished" podID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerID="0becc1c9cddf4ae83b888e476367c0cf4ded6e7e105ae273798ab31ef275d1ea" exitCode=0 Apr 16 20:17:12.283456 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:12.283160 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" event={"ID":"c8596bb3-b088-4a5f-aeee-370a7f8de038","Type":"ContainerDied","Data":"0becc1c9cddf4ae83b888e476367c0cf4ded6e7e105ae273798ab31ef275d1ea"} Apr 16 20:17:14.290486 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:14.290449 2563 generic.go:358] "Generic (PLEG): container finished" podID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerID="1230e62cbba14c8785ce2f65c27e4921c0d6548d26a708689a0b25c2bfcb83c9" exitCode=0 Apr 16 20:17:14.290887 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:14.290505 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" event={"ID":"c8596bb3-b088-4a5f-aeee-370a7f8de038","Type":"ContainerDied","Data":"1230e62cbba14c8785ce2f65c27e4921c0d6548d26a708689a0b25c2bfcb83c9"} Apr 16 20:17:20.202320 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.202285 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh"] Apr 16 20:17:20.205468 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.205450 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.208503 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.208485 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 20:17:20.209580 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.209538 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-tzqkg\"" Apr 16 20:17:20.209686 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.209580 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 20:17:20.209750 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.209689 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 20:17:20.210015 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.209995 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 20:17:20.218467 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.218436 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh"] Apr 16 20:17:20.311195 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.311166 2563 generic.go:358] "Generic (PLEG): container finished" podID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerID="cb370003fe8c9bf2873b767f97fdaaf5a1625abb8c6ebdb2be893218abed4ec0" exitCode=0 Apr 16 20:17:20.311348 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.311202 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" event={"ID":"c8596bb3-b088-4a5f-aeee-370a7f8de038","Type":"ContainerDied","Data":"cb370003fe8c9bf2873b767f97fdaaf5a1625abb8c6ebdb2be893218abed4ec0"} Apr 16 20:17:20.366946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.366916 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzvsv\" (UniqueName: \"kubernetes.io/projected/d3b72025-8484-435b-923d-37fe11906d87-kube-api-access-gzvsv\") pod \"managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh\" (UID: \"d3b72025-8484-435b-923d-37fe11906d87\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.367099 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.366964 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d3b72025-8484-435b-923d-37fe11906d87-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh\" (UID: \"d3b72025-8484-435b-923d-37fe11906d87\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.468031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.467949 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gzvsv\" (UniqueName: \"kubernetes.io/projected/d3b72025-8484-435b-923d-37fe11906d87-kube-api-access-gzvsv\") pod 
\"managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh\" (UID: \"d3b72025-8484-435b-923d-37fe11906d87\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.468031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.468003 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d3b72025-8484-435b-923d-37fe11906d87-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh\" (UID: \"d3b72025-8484-435b-923d-37fe11906d87\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.470391 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.470361 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/d3b72025-8484-435b-923d-37fe11906d87-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh\" (UID: \"d3b72025-8484-435b-923d-37fe11906d87\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.477633 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.477608 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzvsv\" (UniqueName: \"kubernetes.io/projected/d3b72025-8484-435b-923d-37fe11906d87-kube-api-access-gzvsv\") pod \"managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh\" (UID: \"d3b72025-8484-435b-923d-37fe11906d87\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.490878 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.490859 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h"] Apr 16 20:17:20.494415 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.494399 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.497528 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.497511 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 20:17:20.497613 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.497533 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 20:17:20.497880 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.497863 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 20:17:20.497990 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.497977 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 20:17:20.507749 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.507730 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h"] Apr 16 20:17:20.532851 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.532829 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" Apr 16 20:17:20.651172 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.651100 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh"] Apr 16 20:17:20.653234 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:17:20.653204 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3b72025_8484_435b_923d_37fe11906d87.slice/crio-d2d6479da3bb18a0c976c760627a515b4fa500205d4f5544fac6d50f220cdc33 WatchSource:0}: Error finding container d2d6479da3bb18a0c976c760627a515b4fa500205d4f5544fac6d50f220cdc33: Status 404 returned error can't find the container with id d2d6479da3bb18a0c976c760627a515b4fa500205d4f5544fac6d50f220cdc33 Apr 16 20:17:20.669844 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.669821 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.669946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.669856 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.669946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.669882 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-ca\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.670050 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.669948 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-hub\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.670050 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.669976 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5qmc\" (UniqueName: \"kubernetes.io/projected/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-kube-api-access-k5qmc\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.670145 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.670059 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.771086 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.771026 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.771086 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.771064 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.771272 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.771089 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-ca\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.771272 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.771114 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-hub\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.771272 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.771144 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5qmc\" (UniqueName: \"kubernetes.io/projected/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-kube-api-access-k5qmc\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.771272 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.771195 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.771850 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.771824 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.773644 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.773624 2563 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-ca\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.773728 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.773690 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.773823 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.773806 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-hub\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.773922 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.773905 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.779004 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.778983 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5qmc\" (UniqueName: \"kubernetes.io/projected/b51b910c-b6f1-4c2b-ac6a-8ae576878e74-kube-api-access-k5qmc\") pod \"cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h\" (UID: \"b51b910c-b6f1-4c2b-ac6a-8ae576878e74\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.803543 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.803521 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" Apr 16 20:17:20.919303 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:20.919210 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h"] Apr 16 20:17:20.921637 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:17:20.921606 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb51b910c_b6f1_4c2b_ac6a_8ae576878e74.slice/crio-e0ce973198b9fcf106c7b347603583e06704a01aef160be0aed4bb6e545137c7 WatchSource:0}: Error finding container e0ce973198b9fcf106c7b347603583e06704a01aef160be0aed4bb6e545137c7: Status 404 returned error can't find the container with id e0ce973198b9fcf106c7b347603583e06704a01aef160be0aed4bb6e545137c7 Apr 16 20:17:21.314411 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.314375 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" event={"ID":"b51b910c-b6f1-4c2b-ac6a-8ae576878e74","Type":"ContainerStarted","Data":"e0ce973198b9fcf106c7b347603583e06704a01aef160be0aed4bb6e545137c7"} Apr 16 20:17:21.315351 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.315330 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" event={"ID":"d3b72025-8484-435b-923d-37fe11906d87","Type":"ContainerStarted","Data":"d2d6479da3bb18a0c976c760627a515b4fa500205d4f5544fac6d50f220cdc33"} Apr 16 20:17:21.430232 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.430212 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:21.579154 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.579075 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jdfl\" (UniqueName: \"kubernetes.io/projected/c8596bb3-b088-4a5f-aeee-370a7f8de038-kube-api-access-9jdfl\") pod \"c8596bb3-b088-4a5f-aeee-370a7f8de038\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " Apr 16 20:17:21.579154 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.579143 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-bundle\") pod \"c8596bb3-b088-4a5f-aeee-370a7f8de038\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " Apr 16 20:17:21.579367 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.579163 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-util\") pod \"c8596bb3-b088-4a5f-aeee-370a7f8de038\" (UID: \"c8596bb3-b088-4a5f-aeee-370a7f8de038\") " Apr 16 20:17:21.579764 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.579739 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-bundle" (OuterVolumeSpecName: "bundle") pod "c8596bb3-b088-4a5f-aeee-370a7f8de038" (UID: "c8596bb3-b088-4a5f-aeee-370a7f8de038"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:21.581136 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.581115 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8596bb3-b088-4a5f-aeee-370a7f8de038-kube-api-access-9jdfl" (OuterVolumeSpecName: "kube-api-access-9jdfl") pod "c8596bb3-b088-4a5f-aeee-370a7f8de038" (UID: "c8596bb3-b088-4a5f-aeee-370a7f8de038"). InnerVolumeSpecName "kube-api-access-9jdfl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:17:21.584855 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.584831 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-util" (OuterVolumeSpecName: "util") pod "c8596bb3-b088-4a5f-aeee-370a7f8de038" (UID: "c8596bb3-b088-4a5f-aeee-370a7f8de038"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:17:21.680359 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.680324 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jdfl\" (UniqueName: \"kubernetes.io/projected/c8596bb3-b088-4a5f-aeee-370a7f8de038-kube-api-access-9jdfl\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:17:21.680359 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.680354 2563 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-bundle\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:17:21.680359 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:21.680366 2563 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8596bb3-b088-4a5f-aeee-370a7f8de038-util\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:17:22.319770 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:22.319736 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" event={"ID":"c8596bb3-b088-4a5f-aeee-370a7f8de038","Type":"ContainerDied","Data":"ccfcd01a1e3ed4f8e9094002a2cbb59ca7cd6aa6aad7afd41bfa4c234832be0f"} Apr 16 20:17:22.319770 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:22.319758 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29clm78q" Apr 16 20:17:22.319770 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:22.319772 2563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccfcd01a1e3ed4f8e9094002a2cbb59ca7cd6aa6aad7afd41bfa4c234832be0f" Apr 16 20:17:25.332373 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:25.332289 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" event={"ID":"b51b910c-b6f1-4c2b-ac6a-8ae576878e74","Type":"ContainerStarted","Data":"bc202c1aa0e8fc37f4a27abbd089843b90d4496f3b2d00b59be03e47a6aa42f8"} Apr 16 20:17:27.339549 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:27.339511 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" event={"ID":"d3b72025-8484-435b-923d-37fe11906d87","Type":"ContainerStarted","Data":"a674f163f3dde53df0d1ad335ac4b0c8f18bd1a9cb2e2023924858c36a79df19"} Apr 16 20:17:27.369278 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:27.368659 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-67ddbb5f65-9c8vh" podStartSLOduration=1.737957634 podStartE2EDuration="7.368642136s" podCreationTimestamp="2026-04-16 20:17:20 +0000 UTC" firstStartedPulling="2026-04-16 20:17:20.655022483 +0000 UTC m=+337.943948247" lastFinishedPulling="2026-04-16 20:17:26.285706983 +0000 UTC m=+343.574632749" observedRunningTime="2026-04-16 20:17:27.368346607 +0000 UTC m=+344.657272392" watchObservedRunningTime="2026-04-16 20:17:27.368642136 +0000 UTC m=+344.657567922" Apr 16 20:17:28.344644 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:28.344610 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" event={"ID":"b51b910c-b6f1-4c2b-ac6a-8ae576878e74","Type":"ContainerStarted","Data":"e3038d136b37d07aab55571eabc5ac09a40ae66a8d6df6e71b40b0e96765291c"} Apr 16 20:17:28.344644 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:28.344646 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" event={"ID":"b51b910c-b6f1-4c2b-ac6a-8ae576878e74","Type":"ContainerStarted","Data":"2fb0c2e1e048ab718df4fdc84e9f61ee10da218e9afc1d0e10b32c871a506cc6"} Apr 16 20:17:28.370373 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:17:28.370332 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5cdc6fc5bb-sff2h" podStartSLOduration=1.425318653 podStartE2EDuration="8.37031813s" podCreationTimestamp="2026-04-16 20:17:20 +0000 UTC" firstStartedPulling="2026-04-16 20:17:20.92337555 +0000 UTC m=+338.212301334" lastFinishedPulling="2026-04-16 20:17:27.868375035 +0000 UTC m=+345.157300811" observedRunningTime="2026-04-16 20:17:28.368111603 +0000 UTC m=+345.657037389" watchObservedRunningTime="2026-04-16 20:17:28.37031813 +0000 UTC m=+345.659243959" Apr 16 20:20:14.837743 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.837715 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-557c549d6c-nmprp"] Apr 16 20:20:14.838206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.837996 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerName="pull" Apr 16 20:20:14.838206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.838006 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerName="pull" Apr 16 20:20:14.838206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.838025 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerName="util" Apr 16 20:20:14.838206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.838030 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerName="util" Apr 16 20:20:14.838206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.838045 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerName="extract" Apr 16 20:20:14.838206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.838051 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerName="extract" Apr 16 20:20:14.838206 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.838100 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8596bb3-b088-4a5f-aeee-370a7f8de038" containerName="extract" Apr 16 20:20:14.840841 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.840820 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:14.865039 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.865016 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557c549d6c-nmprp"] Apr 16 20:20:14.957900 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.957876 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-oauth-config\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:14.957999 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.957905 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-oauth-serving-cert\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:14.957999 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.957932 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-serving-cert\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:14.958067 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.958017 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-config\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:14.958067 ip-10-0-138-118 kubenswrapper[2563]: I0416 
20:20:14.958063 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-service-ca\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:14.958147 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.958131 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jjpr\" (UniqueName: \"kubernetes.io/projected/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-kube-api-access-5jjpr\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:14.958182 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:14.958159 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-trusted-ca-bundle\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.059299 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.059270 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-config\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.059394 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.059309 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-service-ca\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.059394 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.059342 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jjpr\" (UniqueName: \"kubernetes.io/projected/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-kube-api-access-5jjpr\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.059394 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.059360 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-trusted-ca-bundle\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.059394 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.059378 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-oauth-config\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.059394 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.059394 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-oauth-serving-cert\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.059634 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.059411 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-serving-cert\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.060089 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.060068 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-config\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.060165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.060118 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-service-ca\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.060255 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.060237 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-trusted-ca-bundle\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.060313 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.060240 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-oauth-serving-cert\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.061966 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.061943 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-serving-cert\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.062041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.061954 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-console-oauth-config\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.068551 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.068532 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jjpr\" (UniqueName: \"kubernetes.io/projected/9aaa9c1c-15de-4870-a54f-2ad2e97034e2-kube-api-access-5jjpr\") pod \"console-557c549d6c-nmprp\" (UID: \"9aaa9c1c-15de-4870-a54f-2ad2e97034e2\") " pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.149741 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.149693 
2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:15.270975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.270952 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-557c549d6c-nmprp"] Apr 16 20:20:15.272882 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:20:15.272855 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aaa9c1c_15de_4870_a54f_2ad2e97034e2.slice/crio-160fc371da98d1e776aafb1547fcbd7f63ab2d6c84bfa5bd40e956651370a8f5 WatchSource:0}: Error finding container 160fc371da98d1e776aafb1547fcbd7f63ab2d6c84bfa5bd40e956651370a8f5: Status 404 returned error can't find the container with id 160fc371da98d1e776aafb1547fcbd7f63ab2d6c84bfa5bd40e956651370a8f5 Apr 16 20:20:15.831801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.831768 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557c549d6c-nmprp" event={"ID":"9aaa9c1c-15de-4870-a54f-2ad2e97034e2","Type":"ContainerStarted","Data":"7e689711e237e682569c31187e27ed99d478782652e3f721b3b5868964249f8f"} Apr 16 20:20:15.831801 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.831801 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-557c549d6c-nmprp" event={"ID":"9aaa9c1c-15de-4870-a54f-2ad2e97034e2","Type":"ContainerStarted","Data":"160fc371da98d1e776aafb1547fcbd7f63ab2d6c84bfa5bd40e956651370a8f5"} Apr 16 20:20:15.855180 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:15.855134 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-557c549d6c-nmprp" podStartSLOduration=1.855120071 podStartE2EDuration="1.855120071s" podCreationTimestamp="2026-04-16 20:20:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:20:15.853363509 +0000 UTC m=+513.142289307" watchObservedRunningTime="2026-04-16 20:20:15.855120071 +0000 UTC m=+513.144045856" Apr 16 20:20:25.150524 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:25.150491 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:25.150524 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:25.150528 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:25.155372 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:25.155347 2563 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:25.864342 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:25.864316 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-557c549d6c-nmprp" Apr 16 20:20:25.918687 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:25.918653 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c97ddb46f-zdfxp"] Apr 16 20:20:28.336746 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.336707 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4"] Apr 16 20:20:28.340163 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.340144 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.342840 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.342814 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 20:20:28.342840 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.342832 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 20:20:28.344058 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.344021 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 20:20:28.344175 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.344160 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-wl6nb\"" Apr 16 20:20:28.344387 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.344360 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\"" Apr 16 20:20:28.346163 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.346143 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4"] Apr 16 20:20:28.368187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.368147 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/b2a789e9-ca26-4724-a7f7-40f33ee87848-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.368367 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.368213 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b2a789e9-ca26-4724-a7f7-40f33ee87848-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.368367 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.368245 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j25zx\" (UniqueName: \"kubernetes.io/projected/b2a789e9-ca26-4724-a7f7-40f33ee87848-kube-api-access-j25zx\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.469105 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.469063 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/b2a789e9-ca26-4724-a7f7-40f33ee87848-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.469246 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.469131 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b2a789e9-ca26-4724-a7f7-40f33ee87848-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.469246 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.469157 2563 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-j25zx\" (UniqueName: \"kubernetes.io/projected/b2a789e9-ca26-4724-a7f7-40f33ee87848-kube-api-access-j25zx\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.469585 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.469539 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/b2a789e9-ca26-4724-a7f7-40f33ee87848-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.471501 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.471479 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/b2a789e9-ca26-4724-a7f7-40f33ee87848-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.478366 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.478335 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j25zx\" (UniqueName: \"kubernetes.io/projected/b2a789e9-ca26-4724-a7f7-40f33ee87848-kube-api-access-j25zx\") pod \"seaweedfs-tls-custom-5c88b85bb7-g7mz4\" (UID: \"b2a789e9-ca26-4724-a7f7-40f33ee87848\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.651278 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.651185 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" Apr 16 20:20:28.785508 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.785484 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4"] Apr 16 20:20:28.788071 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:20:28.788044 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2a789e9_ca26_4724_a7f7_40f33ee87848.slice/crio-d733fa9c8224fb83aa8585afe5525589a667414d8781e949ae60369139400804 WatchSource:0}: Error finding container d733fa9c8224fb83aa8585afe5525589a667414d8781e949ae60369139400804: Status 404 returned error can't find the container with id d733fa9c8224fb83aa8585afe5525589a667414d8781e949ae60369139400804 Apr 16 20:20:28.871237 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:28.871187 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" event={"ID":"b2a789e9-ca26-4724-a7f7-40f33ee87848","Type":"ContainerStarted","Data":"d733fa9c8224fb83aa8585afe5525589a667414d8781e949ae60369139400804"} Apr 16 20:20:31.882257 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:31.882220 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" event={"ID":"b2a789e9-ca26-4724-a7f7-40f33ee87848","Type":"ContainerStarted","Data":"e6de9d1bc29bef5b1c8d48ff0f1acf78742527db9c37579d6b2cf7a208fca7f1"} Apr 16 20:20:31.898927 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:31.898870 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-g7mz4" podStartSLOduration=1.478555833 podStartE2EDuration="3.898857598s" podCreationTimestamp="2026-04-16 20:20:28 +0000 UTC" 
firstStartedPulling="2026-04-16 20:20:28.789485545 +0000 UTC m=+526.078411312" lastFinishedPulling="2026-04-16 20:20:31.209787311 +0000 UTC m=+528.498713077" observedRunningTime="2026-04-16 20:20:31.898446597 +0000 UTC m=+529.187372413" watchObservedRunningTime="2026-04-16 20:20:31.898857598 +0000 UTC m=+529.187783383" Apr 16 20:20:50.937702 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:50.937646 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c97ddb46f-zdfxp" podUID="db85e1fb-db90-4d76-9f4a-58eddacb2260" containerName="console" containerID="cri-o://79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea" gracePeriod=15 Apr 16 20:20:51.178222 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.178198 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c97ddb46f-zdfxp_db85e1fb-db90-4d76-9f4a-58eddacb2260/console/0.log" Apr 16 20:20:51.178346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.178261 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:20:51.278213 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278106 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-config\") pod \"db85e1fb-db90-4d76-9f4a-58eddacb2260\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " Apr 16 20:20:51.278213 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278157 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-serving-cert\") pod \"db85e1fb-db90-4d76-9f4a-58eddacb2260\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " Apr 16 20:20:51.278213 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278211 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqmx4\" (UniqueName: \"kubernetes.io/projected/db85e1fb-db90-4d76-9f4a-58eddacb2260-kube-api-access-sqmx4\") pod \"db85e1fb-db90-4d76-9f4a-58eddacb2260\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " Apr 16 20:20:51.278499 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278249 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-service-ca\") pod \"db85e1fb-db90-4d76-9f4a-58eddacb2260\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " Apr 16 20:20:51.278499 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278298 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-oauth-config\") pod \"db85e1fb-db90-4d76-9f4a-58eddacb2260\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " Apr 16 20:20:51.278499 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278374 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-trusted-ca-bundle\") pod \"db85e1fb-db90-4d76-9f4a-58eddacb2260\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " Apr 16 20:20:51.278499 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278444 2563 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-oauth-serving-cert\") pod \"db85e1fb-db90-4d76-9f4a-58eddacb2260\" (UID: \"db85e1fb-db90-4d76-9f4a-58eddacb2260\") " Apr 16 20:20:51.278723 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278553 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-config" (OuterVolumeSpecName: "console-config") pod "db85e1fb-db90-4d76-9f4a-58eddacb2260" (UID: "db85e1fb-db90-4d76-9f4a-58eddacb2260"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:51.278872 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278836 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "db85e1fb-db90-4d76-9f4a-58eddacb2260" (UID: "db85e1fb-db90-4d76-9f4a-58eddacb2260"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:51.279010 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.278948 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "db85e1fb-db90-4d76-9f4a-58eddacb2260" (UID: "db85e1fb-db90-4d76-9f4a-58eddacb2260"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:51.279138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.279113 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-service-ca" (OuterVolumeSpecName: "service-ca") pod "db85e1fb-db90-4d76-9f4a-58eddacb2260" (UID: "db85e1fb-db90-4d76-9f4a-58eddacb2260"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:20:51.280520 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.280499 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "db85e1fb-db90-4d76-9f4a-58eddacb2260" (UID: "db85e1fb-db90-4d76-9f4a-58eddacb2260"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:20:51.280621 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.280521 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "db85e1fb-db90-4d76-9f4a-58eddacb2260" (UID: "db85e1fb-db90-4d76-9f4a-58eddacb2260"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 20:20:51.280621 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.280532 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db85e1fb-db90-4d76-9f4a-58eddacb2260-kube-api-access-sqmx4" (OuterVolumeSpecName: "kube-api-access-sqmx4") pod "db85e1fb-db90-4d76-9f4a-58eddacb2260" (UID: "db85e1fb-db90-4d76-9f4a-58eddacb2260"). InnerVolumeSpecName "kube-api-access-sqmx4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 20:20:51.380275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.380105 2563 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-oauth-config\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:20:51.380275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.380144 2563 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-trusted-ca-bundle\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:20:51.380275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.380161 2563 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-oauth-serving-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:20:51.380275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.380175 2563 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-config\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:20:51.380275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.380191 2563 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/db85e1fb-db90-4d76-9f4a-58eddacb2260-console-serving-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:20:51.380275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.380206 2563 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqmx4\" (UniqueName: \"kubernetes.io/projected/db85e1fb-db90-4d76-9f4a-58eddacb2260-kube-api-access-sqmx4\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:20:51.380275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.380223 2563 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/db85e1fb-db90-4d76-9f4a-58eddacb2260-service-ca\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:20:51.941458 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.941431 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c97ddb46f-zdfxp_db85e1fb-db90-4d76-9f4a-58eddacb2260/console/0.log" Apr 16 20:20:51.941991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.941470 2563 generic.go:358] "Generic (PLEG): container finished" podID="db85e1fb-db90-4d76-9f4a-58eddacb2260" containerID="79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea" exitCode=2 Apr 16 20:20:51.941991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.941514 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c97ddb46f-zdfxp" event={"ID":"db85e1fb-db90-4d76-9f4a-58eddacb2260","Type":"ContainerDied","Data":"79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea"} Apr 16 20:20:51.941991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.941535 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c97ddb46f-zdfxp" Apr 16 20:20:51.941991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.941552 2563 scope.go:117] "RemoveContainer" containerID="79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea" Apr 16 20:20:51.941991 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.941538 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c97ddb46f-zdfxp" event={"ID":"db85e1fb-db90-4d76-9f4a-58eddacb2260","Type":"ContainerDied","Data":"e7f36ec5632451c9099955a1fa163fd89bf033fba5af721a7ca169517bad0e85"} Apr 16 20:20:51.950289 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.950268 2563 scope.go:117] "RemoveContainer" containerID="79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea" Apr 16 20:20:51.950551 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:20:51.950530 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea\": container with ID starting with 79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea not found: ID does not exist" containerID="79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea" Apr 16 20:20:51.950628 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.950573 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea"} err="failed to get container status \"79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea\": rpc error: code = NotFound desc = could not find container \"79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea\": container with ID starting with 79242f483591d9cb2be714d9db935c5c112091eb551b7668e3785145744230ea not found: ID does not exist" Apr 16 20:20:51.959884 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.959862 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c97ddb46f-zdfxp"] Apr 16 20:20:51.964322 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:51.964299 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c97ddb46f-zdfxp"] Apr 16 20:20:53.299280 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:53.299244 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db85e1fb-db90-4d76-9f4a-58eddacb2260" path="/var/lib/kubelet/pods/db85e1fb-db90-4d76-9f4a-58eddacb2260/volumes" Apr 16 20:20:58.717761 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.717732 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5"] Apr 16 20:20:58.718127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.718054 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="db85e1fb-db90-4d76-9f4a-58eddacb2260" containerName="console" Apr 16 20:20:58.718127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.718065 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="db85e1fb-db90-4d76-9f4a-58eddacb2260" containerName="console" Apr 16 20:20:58.718205 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.718138 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="db85e1fb-db90-4d76-9f4a-58eddacb2260" containerName="console" Apr 16 20:20:58.723664 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.723644 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:20:58.726574 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.726526 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-757nb\"" Apr 16 20:20:58.728894 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.728869 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5"] Apr 16 20:20:58.843122 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.843090 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/642a35c5-ec05-47eb-82c4-42ad07b032c5-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7cf94896c9-8slx5\" (UID: \"642a35c5-ec05-47eb-82c4-42ad07b032c5\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:20:58.944192 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.944152 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/642a35c5-ec05-47eb-82c4-42ad07b032c5-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7cf94896c9-8slx5\" (UID: \"642a35c5-ec05-47eb-82c4-42ad07b032c5\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:20:58.944552 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:58.944530 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/642a35c5-ec05-47eb-82c4-42ad07b032c5-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-7cf94896c9-8slx5\" (UID: \"642a35c5-ec05-47eb-82c4-42ad07b032c5\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:20:59.034589 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:59.034463 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:20:59.159276 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:59.159243 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5"] Apr 16 20:20:59.161606 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:20:59.161576 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod642a35c5_ec05_47eb_82c4_42ad07b032c5.slice/crio-b4e108f1f8304a4f67ac826b536e91fcf5c0e1891ab8fbc8164e84075576557a WatchSource:0}: Error finding container b4e108f1f8304a4f67ac826b536e91fcf5c0e1891ab8fbc8164e84075576557a: Status 404 returned error can't find the container with id b4e108f1f8304a4f67ac826b536e91fcf5c0e1891ab8fbc8164e84075576557a Apr 16 20:20:59.968970 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:20:59.968929 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerStarted","Data":"b4e108f1f8304a4f67ac826b536e91fcf5c0e1891ab8fbc8164e84075576557a"} Apr 16 20:21:02.981638 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:02.981598 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerStarted","Data":"e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0"} Apr 16 20:21:06.995648 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:06.995616 2563 generic.go:358] "Generic (PLEG): container finished" podID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerID="e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0" exitCode=0 Apr 16 20:21:06.996122 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:06.995705 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerDied","Data":"e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0"} Apr 16 20:21:20.049762 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:20.049719 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerStarted","Data":"2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d"} Apr 16 20:21:22.058950 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:22.058849 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerStarted","Data":"4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf"} Apr 16 20:21:22.059391 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:22.059173 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:21:22.060482 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:22.060441 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection 
refused" Apr 16 20:21:22.076772 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:22.076725 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podStartSLOduration=1.438517895 podStartE2EDuration="24.076710566s" podCreationTimestamp="2026-04-16 20:20:58 +0000 UTC" firstStartedPulling="2026-04-16 20:20:59.163615658 +0000 UTC m=+556.452541421" lastFinishedPulling="2026-04-16 20:21:21.801808329 +0000 UTC m=+579.090734092" observedRunningTime="2026-04-16 20:21:22.075046546 +0000 UTC m=+579.363972332" watchObservedRunningTime="2026-04-16 20:21:22.076710566 +0000 UTC m=+579.365636350" Apr 16 20:21:23.062070 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:23.062040 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:21:23.062631 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:23.062150 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:21:23.062971 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:23.062948 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:21:24.065275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:24.065233 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:21:24.065683 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:24.065575 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:21:34.065573 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:34.065514 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:21:34.066088 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:34.065972 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:21:44.066035 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:44.065979 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:21:44.066530 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:44.066493 
2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:21:54.066148 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:54.066101 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:21:54.066674 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:21:54.066605 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:04.065795 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:04.065695 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:22:04.068394 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:04.066204 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:14.066208 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:14.066156 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:22:14.066634 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:14.066607 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:24.066707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:24.066672 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:22:24.067125 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:24.066731 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:22:33.917334 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:33.917297 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5"] Apr 16 20:22:33.917816 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:33.917673 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" containerID="cri-o://2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d" gracePeriod=30 Apr 16 20:22:33.917886 
ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:33.917778 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" containerID="cri-o://4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf" gracePeriod=30 Apr 16 20:22:33.993749 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:33.993716 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6"] Apr 16 20:22:33.997252 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:33.997230 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:22:34.006155 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.006133 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6"] Apr 16 20:22:34.065343 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.065310 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:22:34.065695 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.065667 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:34.113796 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.113769 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f271744-c776-48b5-9e2d-b40daa021c09-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6\" (UID: \"0f271744-c776-48b5-9e2d-b40daa021c09\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:22:34.214333 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.214271 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f271744-c776-48b5-9e2d-b40daa021c09-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6\" (UID: \"0f271744-c776-48b5-9e2d-b40daa021c09\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:22:34.214637 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.214617 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f271744-c776-48b5-9e2d-b40daa021c09-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6\" (UID: \"0f271744-c776-48b5-9e2d-b40daa021c09\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:22:34.308377 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.308339 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:22:34.423875 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.423809 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6"] Apr 16 20:22:34.426201 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:22:34.426171 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f271744_c776_48b5_9e2d_b40daa021c09.slice/crio-d2b32a808078c032186cdcb70888551c1674f4ed3a233e39ff80328395d1efb9 WatchSource:0}: Error finding container d2b32a808078c032186cdcb70888551c1674f4ed3a233e39ff80328395d1efb9: Status 404 returned error can't find the container with id d2b32a808078c032186cdcb70888551c1674f4ed3a233e39ff80328395d1efb9 Apr 16 20:22:34.427970 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:34.427953 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:22:35.278299 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:35.278266 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerStarted","Data":"0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8"} Apr 16 20:22:35.278299 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:35.278299 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerStarted","Data":"d2b32a808078c032186cdcb70888551c1674f4ed3a233e39ff80328395d1efb9"} Apr 16 20:22:38.288357 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:38.288276 2563 generic.go:358] "Generic (PLEG): container finished" podID="0f271744-c776-48b5-9e2d-b40daa021c09" containerID="0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8" exitCode=0 Apr 16 20:22:38.288357 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:38.288343 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerDied","Data":"0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8"} Apr 16 20:22:38.290316 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:38.290293 2563 generic.go:358] "Generic (PLEG): container finished" podID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerID="2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d" exitCode=0 Apr 16 20:22:38.290424 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:38.290365 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerDied","Data":"2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d"} Apr 16 20:22:39.298141 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:39.298107 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerStarted","Data":"1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f"} Apr 16 20:22:39.298141 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:39.298141 2563 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerStarted","Data":"669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a"} Apr 16 20:22:39.298683 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:39.298465 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:22:39.299663 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:39.299626 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:22:39.315355 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:39.315315 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podStartSLOduration=6.315302664 podStartE2EDuration="6.315302664s" podCreationTimestamp="2026-04-16 20:22:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:22:39.313156642 +0000 UTC m=+656.602082429" watchObservedRunningTime="2026-04-16 20:22:39.315302664 +0000 UTC m=+656.604228449" Apr 16 20:22:40.299009 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:40.298977 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:22:40.299457 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:40.299067 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:22:40.300015 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:40.299988 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:41.302057 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:41.302016 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:22:41.302432 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:41.302307 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:44.066159 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:44.066116 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:22:44.066630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:44.066428 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:51.302926 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:51.302882 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:22:51.303322 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:51.303294 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:54.065430 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:54.065382 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 20:22:54.065855 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:54.065540 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:22:54.065855 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:54.065720 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:22:54.065855 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:22:54.065816 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:23:01.302503 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:01.302456 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:23:01.303012 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:01.302914 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:04.060838 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.060814 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:23:04.147784 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.147746 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/642a35c5-ec05-47eb-82c4-42ad07b032c5-kserve-provision-location\") pod \"642a35c5-ec05-47eb-82c4-42ad07b032c5\" (UID: \"642a35c5-ec05-47eb-82c4-42ad07b032c5\") " Apr 16 20:23:04.148086 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.148067 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/642a35c5-ec05-47eb-82c4-42ad07b032c5-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "642a35c5-ec05-47eb-82c4-42ad07b032c5" (UID: "642a35c5-ec05-47eb-82c4-42ad07b032c5"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:23:04.248901 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.248819 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/642a35c5-ec05-47eb-82c4-42ad07b032c5-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:23:04.381492 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.381450 2563 generic.go:358] "Generic (PLEG): container finished" podID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerID="4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf" exitCode=0 Apr 16 20:23:04.381685 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.381524 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" Apr 16 20:23:04.381685 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.381520 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerDied","Data":"4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf"} Apr 16 20:23:04.381685 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.381599 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5" event={"ID":"642a35c5-ec05-47eb-82c4-42ad07b032c5","Type":"ContainerDied","Data":"b4e108f1f8304a4f67ac826b536e91fcf5c0e1891ab8fbc8164e84075576557a"} Apr 16 20:23:04.381685 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.381619 2563 scope.go:117] "RemoveContainer" containerID="4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf" Apr 16 20:23:04.389664 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.389647 2563 scope.go:117] "RemoveContainer" containerID="2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d" Apr 16 20:23:04.398834 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.398807 2563 scope.go:117] "RemoveContainer" containerID="e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0" Apr 16 20:23:04.406050 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.406030 2563 scope.go:117] "RemoveContainer" containerID="4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf" Apr 16 20:23:04.406287 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:23:04.406268 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf\": container with ID starting with 4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf not found: ID does not exist" containerID="4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf" Apr 16 20:23:04.406341 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.406303 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf"} err="failed to get container status \"4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf\": rpc error: code = NotFound desc = could not find container \"4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf\": container with ID starting with 4cd19d22a6d3c5c77b7aca47f50bf271a27909b8947fd7213f57d3490d9cecaf not found: ID does not exist" Apr 16 20:23:04.406341 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.406320 2563 scope.go:117] "RemoveContainer" containerID="2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d" Apr 16 20:23:04.406590 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:23:04.406572 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d\": container with ID starting with 2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d not found: ID does not exist" containerID="2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d" Apr 16 20:23:04.406634 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.406596 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d"} err="failed to get container status \"2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d\": rpc error: code = NotFound desc = could not find container \"2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d\": container with ID starting with 2d1eb7781a4e3d6490023d75e11562da57791aeff59f4f128d6d6bcc0561423d not found: ID does not exist" Apr 16 20:23:04.406634 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.406612 2563 scope.go:117] "RemoveContainer" containerID="e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0" Apr 16 20:23:04.406835 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:23:04.406818 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0\": container with ID starting with e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0 not found: ID does not exist" containerID="e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0" Apr 16 20:23:04.406877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.406840 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0"} err="failed to get container status \"e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0\": rpc error: code = NotFound desc = could not find container \"e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0\": container with ID starting with e675e35c5c074a6c6bf24a719ceabd174b96f803d418a899a97a91e9464fb4a0 not found: ID does not exist" Apr 16 20:23:04.410483 ip-10-0-138-118 kubenswrapper[2563]: I0416 
20:23:04.410462 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5"] Apr 16 20:23:04.413610 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:04.413589 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-7cf94896c9-8slx5"] Apr 16 20:23:05.298650 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:05.298620 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" path="/var/lib/kubelet/pods/642a35c5-ec05-47eb-82c4-42ad07b032c5/volumes" Apr 16 20:23:11.301978 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:11.301930 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:23:11.302583 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:11.302346 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:21.302712 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:21.302663 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:23:21.303135 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:21.303104 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:31.302698 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:31.302657 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:23:31.303171 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:31.303088 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:41.302621 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:41.302519 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:23:41.302983 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:41.302688 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:23:49.086490 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.086452 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6"] Apr 16 
20:23:49.086973 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.086837 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" containerID="cri-o://669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a" gracePeriod=30 Apr 16 20:23:49.086973 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.086933 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" containerID="cri-o://1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f" gracePeriod=30 Apr 16 20:23:49.129466 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129440 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt"] Apr 16 20:23:49.129797 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129784 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="storage-initializer" Apr 16 20:23:49.129841 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129799 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="storage-initializer" Apr 16 20:23:49.129841 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129806 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" Apr 16 20:23:49.129841 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129812 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" Apr 16 20:23:49.129841 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129819 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" Apr 16 20:23:49.129841 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129824 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" Apr 16 20:23:49.129995 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129892 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="agent" Apr 16 20:23:49.129995 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.129903 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="642a35c5-ec05-47eb-82c4-42ad07b032c5" containerName="kserve-container" Apr 16 20:23:49.133913 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.133897 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" Apr 16 20:23:49.140371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.140345 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt"] Apr 16 20:23:49.144317 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.144298 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" Apr 16 20:23:49.262408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.262253 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt"] Apr 16 20:23:49.264773 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:23:49.264742 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d58d9f8_f7a5_4cff_ad7d_e9dc234fb956.slice/crio-a875e8a6ac0955252620f3f372911178a67eed37f0ce6b7b4f1458f75e5e3e9d WatchSource:0}: Error finding container a875e8a6ac0955252620f3f372911178a67eed37f0ce6b7b4f1458f75e5e3e9d: Status 404 returned error can't find the container with id a875e8a6ac0955252620f3f372911178a67eed37f0ce6b7b4f1458f75e5e3e9d Apr 16 20:23:49.523341 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:49.523311 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" event={"ID":"4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956","Type":"ContainerStarted","Data":"a875e8a6ac0955252620f3f372911178a67eed37f0ce6b7b4f1458f75e5e3e9d"} Apr 16 20:23:50.527863 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:50.527792 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" event={"ID":"4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956","Type":"ContainerStarted","Data":"c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154"} Apr 16 20:23:50.528244 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:50.527999 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" Apr 16 20:23:50.529735 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:50.529708 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" Apr 16 20:23:50.542974 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:50.542929 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" podStartSLOduration=0.547243453 podStartE2EDuration="1.54291508s" podCreationTimestamp="2026-04-16 20:23:49 +0000 UTC" firstStartedPulling="2026-04-16 20:23:49.266698969 +0000 UTC m=+726.555624732" lastFinishedPulling="2026-04-16 20:23:50.262370593 +0000 UTC m=+727.551296359" observedRunningTime="2026-04-16 20:23:50.541018256 +0000 UTC m=+727.829944042" watchObservedRunningTime="2026-04-16 20:23:50.54291508 +0000 UTC m=+727.831840865" Apr 16 20:23:51.302677 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:51.302638 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:23:51.302952 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:51.302932 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:23:53.539824 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:53.539792 2563 generic.go:358] "Generic (PLEG): container 
finished" podID="0f271744-c776-48b5-9e2d-b40daa021c09" containerID="669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a" exitCode=0 Apr 16 20:23:53.540191 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:53.539862 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerDied","Data":"669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a"} Apr 16 20:23:59.184120 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.184088 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"] Apr 16 20:23:59.188016 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.187995 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:23:59.193305 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.193282 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"] Apr 16 20:23:59.253763 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.253728 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d33ea5de-e54b-42bb-889a-88aedb6f347e-kserve-provision-location\") pod \"isvc-logger-predictor-7db75b5b6d-f8q5c\" (UID: \"d33ea5de-e54b-42bb-889a-88aedb6f347e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:23:59.354516 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.354491 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d33ea5de-e54b-42bb-889a-88aedb6f347e-kserve-provision-location\") pod \"isvc-logger-predictor-7db75b5b6d-f8q5c\" (UID: \"d33ea5de-e54b-42bb-889a-88aedb6f347e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:23:59.354856 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.354839 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d33ea5de-e54b-42bb-889a-88aedb6f347e-kserve-provision-location\") pod \"isvc-logger-predictor-7db75b5b6d-f8q5c\" (UID: \"d33ea5de-e54b-42bb-889a-88aedb6f347e\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:23:59.499594 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.499504 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:23:59.638883 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:23:59.638855 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"] Apr 16 20:23:59.640894 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:23:59.640870 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd33ea5de_e54b_42bb_889a_88aedb6f347e.slice/crio-1314a8b196454b2a7956e7f557b669008f2c73135a522c9e5d9b93d3bf716334 WatchSource:0}: Error finding container 1314a8b196454b2a7956e7f557b669008f2c73135a522c9e5d9b93d3bf716334: Status 404 returned error can't find the container with id 1314a8b196454b2a7956e7f557b669008f2c73135a522c9e5d9b93d3bf716334 Apr 16 20:24:00.563770 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:00.563737 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerStarted","Data":"07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851"} Apr 16 20:24:00.563770 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:00.563771 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerStarted","Data":"1314a8b196454b2a7956e7f557b669008f2c73135a522c9e5d9b93d3bf716334"} Apr 16 20:24:01.301965 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:01.301919 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:24:01.302233 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:01.302206 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:03.575152 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:03.575119 2563 generic.go:358] "Generic (PLEG): container finished" podID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerID="07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851" exitCode=0 Apr 16 20:24:03.575497 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:03.575175 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerDied","Data":"07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851"} Apr 16 20:24:04.580383 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:04.580347 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerStarted","Data":"039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d"} Apr 16 20:24:04.580383 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:04.580387 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" 
event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerStarted","Data":"6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6"} Apr 16 20:24:04.580830 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:04.580695 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:24:04.580830 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:04.580725 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:24:04.582125 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:04.582086 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:24:04.582771 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:04.582742 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:04.598205 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:04.598161 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podStartSLOduration=5.5981498179999996 podStartE2EDuration="5.598149818s" podCreationTimestamp="2026-04-16 20:23:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:24:04.59682521 +0000 UTC m=+741.885750994" watchObservedRunningTime="2026-04-16 20:24:04.598149818 +0000 UTC m=+741.887075604" Apr 16 20:24:05.584518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:05.584475 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:24:05.584983 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:05.584889 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:11.302149 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:11.302107 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:5000: connect: connection refused" Apr 16 20:24:11.302597 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:11.302216 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:24:11.302597 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:11.302525 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:11.302676 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:11.302617 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:24:15.585424 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:15.585376 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:24:15.585884 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:15.585843 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:19.235718 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.235690 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:24:19.313334 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.313289 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f271744-c776-48b5-9e2d-b40daa021c09-kserve-provision-location\") pod \"0f271744-c776-48b5-9e2d-b40daa021c09\" (UID: \"0f271744-c776-48b5-9e2d-b40daa021c09\") " Apr 16 20:24:19.313640 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.313614 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f271744-c776-48b5-9e2d-b40daa021c09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0f271744-c776-48b5-9e2d-b40daa021c09" (UID: "0f271744-c776-48b5-9e2d-b40daa021c09"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:24:19.414996 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.414906 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0f271744-c776-48b5-9e2d-b40daa021c09-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:24:19.632372 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.632327 2563 generic.go:358] "Generic (PLEG): container finished" podID="0f271744-c776-48b5-9e2d-b40daa021c09" containerID="1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f" exitCode=0 Apr 16 20:24:19.632585 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.632428 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerDied","Data":"1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f"} Apr 16 20:24:19.632585 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.632472 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" event={"ID":"0f271744-c776-48b5-9e2d-b40daa021c09","Type":"ContainerDied","Data":"d2b32a808078c032186cdcb70888551c1674f4ed3a233e39ff80328395d1efb9"} Apr 16 20:24:19.632585 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.632488 2563 scope.go:117] "RemoveContainer" containerID="1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f" Apr 16 20:24:19.632585 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.632436 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6" Apr 16 20:24:19.642136 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.642108 2563 scope.go:117] "RemoveContainer" containerID="669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a" Apr 16 20:24:19.650430 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.650407 2563 scope.go:117] "RemoveContainer" containerID="0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8" Apr 16 20:24:19.656843 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.656811 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6"] Apr 16 20:24:19.659337 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.659314 2563 scope.go:117] "RemoveContainer" containerID="1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f" Apr 16 20:24:19.659777 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:24:19.659752 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f\": container with ID starting with 1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f not found: ID does not exist" containerID="1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f" Apr 16 20:24:19.659867 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.659780 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-6467bf5b48-r5jf6"] Apr 16 20:24:19.659867 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.659786 2563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f"} err="failed to get container status \"1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f\": rpc error: code = NotFound desc = could not find container \"1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f\": container with ID starting with 1acaa320760a48339323d98ba457c8330cd4b633913d767fc85fbf5de4e6cf7f not found: ID does not exist" Apr 16 20:24:19.659867 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.659813 2563 scope.go:117] "RemoveContainer" containerID="669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a" Apr 16 20:24:19.660052 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:24:19.660033 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a\": container with ID starting with 669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a not found: ID does not exist" containerID="669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a" Apr 16 20:24:19.660090 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.660060 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a"} err="failed to get container status \"669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a\": rpc error: code = NotFound desc = could not find container \"669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a\": container with ID starting with 669ab9e463a3b3d770c200d510d762c1b47ded4169a5d784bf0c7db7c3aed80a not found: ID does not exist" Apr 16 20:24:19.660090 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.660078 2563 scope.go:117] "RemoveContainer" containerID="0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8" Apr 16 20:24:19.660295 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:24:19.660279 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8\": container with ID starting with 0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8 not found: ID does not exist" containerID="0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8" Apr 16 20:24:19.660348 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:19.660298 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8"} err="failed to get container status \"0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8\": rpc error: code = NotFound desc = could not find container \"0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8\": container with ID starting with 0ec582eae8f8bcc619e8882c31affb4f3f878d61d01f2558d4aca04d69d1fda8 not found: ID does not exist" Apr 16 20:24:21.299109 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:21.299077 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" path="/var/lib/kubelet/pods/0f271744-c776-48b5-9e2d-b40daa021c09/volumes" Apr 16 20:24:25.585433 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:25.585390 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" 
podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:24:25.585925 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:25.585901 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:35.585326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:35.585274 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:24:35.585749 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:35.585662 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:45.584778 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:45.584727 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:24:45.585251 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:45.585137 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:24:55.584588 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:55.584496 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:24:55.585109 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:24:55.584995 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:25:05.584872 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:05.584764 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused" Apr 16 20:25:05.585326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:05.585304 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 20:25:15.585444 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:15.585413 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:25:15.585878 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:15.585527 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" Apr 16 20:25:24.236777 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.236745 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-xtdnt_4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956/kserve-container/0.log" Apr 16 20:25:24.390143 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.390108 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"] Apr 16 20:25:24.390415 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.390393 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" containerID="cri-o://6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6" gracePeriod=30 Apr 16 20:25:24.390649 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.390589 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" containerID="cri-o://039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d" gracePeriod=30 Apr 16 20:25:24.440536 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.440495 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"] Apr 16 20:25:24.440918 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.440900 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" Apr 16 20:25:24.440999 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.440921 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" Apr 16 20:25:24.440999 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.440944 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" Apr 16 20:25:24.440999 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.440954 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" Apr 16 20:25:24.440999 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.440968 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="storage-initializer" Apr 16 20:25:24.440999 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.440977 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="storage-initializer" Apr 16 20:25:24.441245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.441060 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="kserve-container" Apr 16 20:25:24.441245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.441073 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="0f271744-c776-48b5-9e2d-b40daa021c09" containerName="agent" Apr 16 20:25:24.444135 
ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.444116 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" Apr 16 20:25:24.450886 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.450838 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"] Apr 16 20:25:24.506716 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.506632 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt"] Apr 16 20:25:24.506912 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.506890 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" podUID="4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956" containerName="kserve-container" containerID="cri-o://c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154" gracePeriod=30 Apr 16 20:25:24.555753 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.555709 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/482fe637-8ba3-40ac-9df2-992d3070b20c-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-q9ch2\" (UID: \"482fe637-8ba3-40ac-9df2-992d3070b20c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" Apr 16 20:25:24.656782 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.656748 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/482fe637-8ba3-40ac-9df2-992d3070b20c-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-q9ch2\" (UID: \"482fe637-8ba3-40ac-9df2-992d3070b20c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" Apr 16 20:25:24.657209 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.657187 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/482fe637-8ba3-40ac-9df2-992d3070b20c-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-q9ch2\" (UID: \"482fe637-8ba3-40ac-9df2-992d3070b20c\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" Apr 16 20:25:24.744960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.744937 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" Apr 16 20:25:24.756039 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.756015 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" Apr 16 20:25:24.851157 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.851126 2563 generic.go:358] "Generic (PLEG): container finished" podID="4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956" containerID="c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154" exitCode=2 Apr 16 20:25:24.851296 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.851196 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt"
Apr 16 20:25:24.851296 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.851192 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" event={"ID":"4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956","Type":"ContainerDied","Data":"c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154"}
Apr 16 20:25:24.851411 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.851297 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt" event={"ID":"4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956","Type":"ContainerDied","Data":"a875e8a6ac0955252620f3f372911178a67eed37f0ce6b7b4f1458f75e5e3e9d"}
Apr 16 20:25:24.851411 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.851316 2563 scope.go:117] "RemoveContainer" containerID="c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154"
Apr 16 20:25:24.862356 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.862296 2563 scope.go:117] "RemoveContainer" containerID="c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154"
Apr 16 20:25:24.862691 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:25:24.862600 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154\": container with ID starting with c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154 not found: ID does not exist" containerID="c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154"
Apr 16 20:25:24.862691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.862652 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154"} err="failed to get container status \"c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154\": rpc error: code = NotFound desc = could not find container \"c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154\": container with ID starting with c037e9f03c9887a17db115caf51ae404046e98871fc9de1a4e989db7af192154 not found: ID does not exist"
Apr 16 20:25:24.878841 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.878819 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt"]
Apr 16 20:25:24.882216 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.882195 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"]
Apr 16 20:25:24.884892 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:24.884871 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-xtdnt"]
Apr 16 20:25:24.885411 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:25:24.885385 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod482fe637_8ba3_40ac_9df2_992d3070b20c.slice/crio-31e6cad078926ec945c9fb004ef3171e3416c6b38220e561efeb1165274ae630 WatchSource:0}: Error finding container 31e6cad078926ec945c9fb004ef3171e3416c6b38220e561efeb1165274ae630: Status 404 returned error can't find the container with id 31e6cad078926ec945c9fb004ef3171e3416c6b38220e561efeb1165274ae630
Apr 16 20:25:25.298728 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:25.298691 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956" path="/var/lib/kubelet/pods/4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956/volumes"
Apr 16 20:25:25.585464 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:25.585356 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 20:25:25.585676 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:25.585649 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:25:25.855967 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:25.855881 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" event={"ID":"482fe637-8ba3-40ac-9df2-992d3070b20c","Type":"ContainerStarted","Data":"16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670"}
Apr 16 20:25:25.855967 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:25.855919 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" event={"ID":"482fe637-8ba3-40ac-9df2-992d3070b20c","Type":"ContainerStarted","Data":"31e6cad078926ec945c9fb004ef3171e3416c6b38220e561efeb1165274ae630"}
Apr 16 20:25:28.868489 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:28.868399 2563 generic.go:358] "Generic (PLEG): container finished" podID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerID="6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6" exitCode=0
Apr 16 20:25:28.868489 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:28.868473 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerDied","Data":"6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6"}
Apr 16 20:25:28.869759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:28.869734 2563 generic.go:358] "Generic (PLEG): container finished" podID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerID="16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670" exitCode=0
Apr 16 20:25:28.869861 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:28.869808 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" event={"ID":"482fe637-8ba3-40ac-9df2-992d3070b20c","Type":"ContainerDied","Data":"16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670"}
Apr 16 20:25:35.585513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:35.585461 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 20:25:35.585925 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:35.585795 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:25:35.898448 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:35.898363 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" event={"ID":"482fe637-8ba3-40ac-9df2-992d3070b20c","Type":"ContainerStarted","Data":"f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7"}
Apr 16 20:25:35.898677 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:35.898655 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"
Apr 16 20:25:35.899814 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:35.899784 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:25:35.914339 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:35.914278 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podStartSLOduration=5.690862006 podStartE2EDuration="11.914260852s" podCreationTimestamp="2026-04-16 20:25:24 +0000 UTC" firstStartedPulling="2026-04-16 20:25:28.871072263 +0000 UTC m=+826.159998026" lastFinishedPulling="2026-04-16 20:25:35.094471106 +0000 UTC m=+832.383396872" observedRunningTime="2026-04-16 20:25:35.912857852 +0000 UTC m=+833.201783662" watchObservedRunningTime="2026-04-16 20:25:35.914260852 +0000 UTC m=+833.203186638"
Apr 16 20:25:36.902228 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:36.902185 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:25:45.584545 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:45.584498 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.30:8080: connect: connection refused"
Apr 16 20:25:45.584985 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:45.584641 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"
Apr 16 20:25:45.584985 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:45.584857 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 20:25:45.584985 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:45.584932 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"
Apr 16 20:25:46.902633 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:46.902594 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:25:54.574205 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.574179 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"
Apr 16 20:25:54.593365 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.593342 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d33ea5de-e54b-42bb-889a-88aedb6f347e-kserve-provision-location\") pod \"d33ea5de-e54b-42bb-889a-88aedb6f347e\" (UID: \"d33ea5de-e54b-42bb-889a-88aedb6f347e\") "
Apr 16 20:25:54.593733 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.593706 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d33ea5de-e54b-42bb-889a-88aedb6f347e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d33ea5de-e54b-42bb-889a-88aedb6f347e" (UID: "d33ea5de-e54b-42bb-889a-88aedb6f347e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:25:54.694636 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.694606 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d33ea5de-e54b-42bb-889a-88aedb6f347e-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 20:25:54.963216 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.963126 2563 generic.go:358] "Generic (PLEG): container finished" podID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerID="039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d" exitCode=137
Apr 16 20:25:54.963352 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.963212 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerDied","Data":"039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d"}
Apr 16 20:25:54.963352 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.963241 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"
Apr 16 20:25:54.963352 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.963253 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c" event={"ID":"d33ea5de-e54b-42bb-889a-88aedb6f347e","Type":"ContainerDied","Data":"1314a8b196454b2a7956e7f557b669008f2c73135a522c9e5d9b93d3bf716334"}
Apr 16 20:25:54.963352 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.963271 2563 scope.go:117] "RemoveContainer" containerID="039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d"
Apr 16 20:25:54.971424 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.971397 2563 scope.go:117] "RemoveContainer" containerID="6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6"
Apr 16 20:25:54.978082 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.978065 2563 scope.go:117] "RemoveContainer" containerID="07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851"
Apr 16 20:25:54.984161 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.984142 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"]
Apr 16 20:25:54.985100 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.985087 2563 scope.go:117] "RemoveContainer" containerID="039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d"
Apr 16 20:25:54.985368 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:25:54.985297 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d\": container with ID starting with 039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d not found: ID does not exist" containerID="039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d"
Apr 16 20:25:54.985434 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.985378 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d"} err="failed to get container status \"039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d\": rpc error: code = NotFound desc = could not find container \"039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d\": container with ID starting with 039bdc2467133bd318bfbbf89a97d01667a06f92125e4c507b63a17c0c5b911d not found: ID does not exist"
Apr 16 20:25:54.985434 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.985401 2563 scope.go:117] "RemoveContainer" containerID="6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6"
Apr 16 20:25:54.985942 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:25:54.985730 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6\": container with ID starting with 6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6 not found: ID does not exist" containerID="6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6"
Apr 16 20:25:54.985942 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.985759 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6"} err="failed to get container status \"6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6\": rpc error: code = NotFound desc = could not find container \"6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6\": container with ID starting with 6a7df36e48f172c348bef12a83d76e8a4b1d8caccbbbf3e23175362bf4c534f6 not found: ID does not exist"
Apr 16 20:25:54.985942 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.985781 2563 scope.go:117] "RemoveContainer" containerID="07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851"
Apr 16 20:25:54.986093 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:25:54.986048 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851\": container with ID starting with 07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851 not found: ID does not exist" containerID="07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851"
Apr 16 20:25:54.986093 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.986075 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851"} err="failed to get container status \"07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851\": rpc error: code = NotFound desc = could not find container \"07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851\": container with ID starting with 07183f798ed96463949a26f634cb7b6636239b76f0ed2961ef16267a108a7851 not found: ID does not exist"
Apr 16 20:25:54.987880 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:54.987862 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-7db75b5b6d-f8q5c"]
Apr 16 20:25:55.298161 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:55.298090 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" path="/var/lib/kubelet/pods/d33ea5de-e54b-42bb-889a-88aedb6f347e/volumes"
Apr 16 20:25:56.902981 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:25:56.902939 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:26:06.902741 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:06.902703 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:26:16.902713 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:16.902675 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:26:26.902855 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:26.902818 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:26:36.902446 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:36.902360 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:26:38.294934 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:38.294896 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:26:48.295485 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:48.295457 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"
Apr 16 20:26:54.596220 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.596189 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"]
Apr 16 20:26:54.596593 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.596429 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" containerID="cri-o://f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7" gracePeriod=30
Apr 16 20:26:54.696600 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.696552 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"]
Apr 16 20:26:54.696938 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.696922 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent"
Apr 16 20:26:54.697022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.696941 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent"
Apr 16 20:26:54.697022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.696955 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container"
Apr 16 20:26:54.697022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.696963 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container"
Apr 16 20:26:54.697022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.696984 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956" containerName="kserve-container"
Apr 16 20:26:54.697022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.696993 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956" containerName="kserve-container"
Apr 16 20:26:54.697022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.697007 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="storage-initializer"
Apr 16 20:26:54.697022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.697015 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="storage-initializer"
Apr 16 20:26:54.697355 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.697096 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="kserve-container"
Apr 16 20:26:54.697355 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.697108 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d58d9f8-f7a5-4cff-ad7d-e9dc234fb956" containerName="kserve-container"
Apr 16 20:26:54.697355 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.697118 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="d33ea5de-e54b-42bb-889a-88aedb6f347e" containerName="agent"
Apr 16 20:26:54.701382 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.701363 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:26:54.707452 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.707428 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"]
Apr 16 20:26:54.828518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.828493 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c97a4c-51a6-464c-874a-dd2fdab95474-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-657wb\" (UID: \"b5c97a4c-51a6-464c-874a-dd2fdab95474\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:26:54.929794 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.929770 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c97a4c-51a6-464c-874a-dd2fdab95474-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-657wb\" (UID: \"b5c97a4c-51a6-464c-874a-dd2fdab95474\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:26:54.930100 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:54.930082 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c97a4c-51a6-464c-874a-dd2fdab95474-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-657wb\" (UID: \"b5c97a4c-51a6-464c-874a-dd2fdab95474\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:26:55.013495 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:55.013472 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:26:55.132511 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:55.132482 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"]
Apr 16 20:26:55.135182 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:26:55.135156 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5c97a4c_51a6_464c_874a_dd2fdab95474.slice/crio-82576c08fa90fefa14590a4028a21966f0affdaa33acf8bac511814983d2767c WatchSource:0}: Error finding container 82576c08fa90fefa14590a4028a21966f0affdaa33acf8bac511814983d2767c: Status 404 returned error can't find the container with id 82576c08fa90fefa14590a4028a21966f0affdaa33acf8bac511814983d2767c
Apr 16 20:26:55.164834 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:55.164805 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" event={"ID":"b5c97a4c-51a6-464c-874a-dd2fdab95474","Type":"ContainerStarted","Data":"82576c08fa90fefa14590a4028a21966f0affdaa33acf8bac511814983d2767c"}
Apr 16 20:26:56.169233 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:56.169198 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" event={"ID":"b5c97a4c-51a6-464c-874a-dd2fdab95474","Type":"ContainerStarted","Data":"0670f67ecc54d8ab13218f2253ddb5ae95429e82e9d03936c9addefd41bf80f9"}
Apr 16 20:26:58.295000 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:58.294954 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused"
Apr 16 20:26:58.633507 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:58.633486 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"
Apr 16 20:26:58.756330 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:58.756299 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/482fe637-8ba3-40ac-9df2-992d3070b20c-kserve-provision-location\") pod \"482fe637-8ba3-40ac-9df2-992d3070b20c\" (UID: \"482fe637-8ba3-40ac-9df2-992d3070b20c\") "
Apr 16 20:26:58.756641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:58.756616 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/482fe637-8ba3-40ac-9df2-992d3070b20c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "482fe637-8ba3-40ac-9df2-992d3070b20c" (UID: "482fe637-8ba3-40ac-9df2-992d3070b20c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:26:58.857513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:58.857491 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/482fe637-8ba3-40ac-9df2-992d3070b20c-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 20:26:59.181797 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.181766 2563 generic.go:358] "Generic (PLEG): container finished" podID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerID="f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7" exitCode=0
Apr 16 20:26:59.181946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.181827 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"
Apr 16 20:26:59.181946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.181821 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" event={"ID":"482fe637-8ba3-40ac-9df2-992d3070b20c","Type":"ContainerDied","Data":"f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7"}
Apr 16 20:26:59.181946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.181931 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2" event={"ID":"482fe637-8ba3-40ac-9df2-992d3070b20c","Type":"ContainerDied","Data":"31e6cad078926ec945c9fb004ef3171e3416c6b38220e561efeb1165274ae630"}
Apr 16 20:26:59.182075 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.181950 2563 scope.go:117] "RemoveContainer" containerID="f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7"
Apr 16 20:26:59.183291 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.183270 2563 generic.go:358] "Generic (PLEG): container finished" podID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerID="0670f67ecc54d8ab13218f2253ddb5ae95429e82e9d03936c9addefd41bf80f9" exitCode=0
Apr 16 20:26:59.183370 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.183320 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" event={"ID":"b5c97a4c-51a6-464c-874a-dd2fdab95474","Type":"ContainerDied","Data":"0670f67ecc54d8ab13218f2253ddb5ae95429e82e9d03936c9addefd41bf80f9"}
Apr 16 20:26:59.190701 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.190674 2563 scope.go:117] "RemoveContainer" containerID="16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670"
Apr 16 20:26:59.197652 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.197633 2563 scope.go:117] "RemoveContainer" containerID="f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7"
Apr 16 20:26:59.197916 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:26:59.197899 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7\": container with ID starting with f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7 not found: ID does not exist" containerID="f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7"
Apr 16 20:26:59.197984 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.197922 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7"} err="failed to get container status \"f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7\": rpc error: code = NotFound desc = could not find container \"f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7\": container with ID starting with f882d23ba5f9a6f5ad13303c9bdf805b5353c43d67a7266d893bb1ab790e38b7 not found: ID does not exist"
Apr 16 20:26:59.197984 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.197938 2563 scope.go:117] "RemoveContainer" containerID="16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670"
Apr 16 20:26:59.198149 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:26:59.198135 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670\": container with ID starting with 16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670 not found: ID does not exist" containerID="16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670"
Apr 16 20:26:59.198189 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.198152 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670"} err="failed to get container status \"16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670\": rpc error: code = NotFound desc = could not find container \"16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670\": container with ID starting with 16d341ff89b85733a354faff38747fed2281bce1ba8fe62ea8b99f0394f2a670 not found: ID does not exist"
Apr 16 20:26:59.215673 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.215654 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"]
Apr 16 20:26:59.217379 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.217361 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-q9ch2"]
Apr 16 20:26:59.298672 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:26:59.298649 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" path="/var/lib/kubelet/pods/482fe637-8ba3-40ac-9df2-992d3070b20c/volumes"
Apr 16 20:27:00.188829 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:00.188799 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" event={"ID":"b5c97a4c-51a6-464c-874a-dd2fdab95474","Type":"ContainerStarted","Data":"5f2efefabf4aed3164143d5baf7774a0aa48d9a83d72755ec139df06d6f56827"}
Apr 16 20:27:00.189096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:00.189075 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:27:00.190082 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:00.190058 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:27:00.204659 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:00.204616 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podStartSLOduration=6.204603193 podStartE2EDuration="6.204603193s" podCreationTimestamp="2026-04-16 20:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:27:00.203846285 +0000 UTC m=+917.492772070" watchObservedRunningTime="2026-04-16 20:27:00.204603193 +0000 UTC m=+917.493528978"
Apr 16 20:27:01.192577 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:01.192528 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:27:11.193951 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:11.193913 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:27:21.193068 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:21.193023 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:27:31.193187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:31.193135 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:27:41.193075 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:41.193026 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:27:51.193531 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:27:51.193487 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:28:01.193030 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:01.192983 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.32:8080: connect: connection refused"
Apr 16 20:28:11.193796 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:11.193713 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:28:14.972447 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:14.972412 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"]
Apr 16 20:28:14.972998 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:14.972754 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container" containerID="cri-o://5f2efefabf4aed3164143d5baf7774a0aa48d9a83d72755ec139df06d6f56827" gracePeriod=30
Apr 16 20:28:15.035713 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.035680 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"]
Apr 16 20:28:15.036002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.035990 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container"
Apr 16 20:28:15.036048 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.036003 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container"
Apr 16 20:28:15.036048 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.036017 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="storage-initializer"
Apr 16 20:28:15.036048 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.036022 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="storage-initializer"
Apr 16 20:28:15.036137 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.036074 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="482fe637-8ba3-40ac-9df2-992d3070b20c" containerName="kserve-container"
Apr 16 20:28:15.039011 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.038995 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:28:15.047188 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.047166 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"]
Apr 16 20:28:15.070389 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.070362 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12a0452a-d379-4b23-b0f2-bda444ad1266-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq\" (UID: \"12a0452a-d379-4b23-b0f2-bda444ad1266\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:28:15.171345 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.171312 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12a0452a-d379-4b23-b0f2-bda444ad1266-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq\" (UID: \"12a0452a-d379-4b23-b0f2-bda444ad1266\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:28:15.171727 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.171705 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12a0452a-d379-4b23-b0f2-bda444ad1266-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq\" (UID: \"12a0452a-d379-4b23-b0f2-bda444ad1266\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:28:15.350002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.349927 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:28:15.467581 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.467467 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"]
Apr 16 20:28:15.469822 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:28:15.469789 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12a0452a_d379_4b23_b0f2_bda444ad1266.slice/crio-ac8ecacd8094a489d20108f4d800fbb2e1c5ed14ecba088fe4292d78d6fbf51e WatchSource:0}: Error finding container ac8ecacd8094a489d20108f4d800fbb2e1c5ed14ecba088fe4292d78d6fbf51e: Status 404 returned error can't find the container with id ac8ecacd8094a489d20108f4d800fbb2e1c5ed14ecba088fe4292d78d6fbf51e
Apr 16 20:28:15.471588 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:15.471544 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 20:28:16.425960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:16.425928 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" event={"ID":"12a0452a-d379-4b23-b0f2-bda444ad1266","Type":"ContainerStarted","Data":"f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759"}
Apr 16 20:28:16.425960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:16.425964 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" event={"ID":"12a0452a-d379-4b23-b0f2-bda444ad1266","Type":"ContainerStarted","Data":"ac8ecacd8094a489d20108f4d800fbb2e1c5ed14ecba088fe4292d78d6fbf51e"}
Apr 16 20:28:19.439860 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.439825 2563 generic.go:358] "Generic (PLEG): container finished" podID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerID="5f2efefabf4aed3164143d5baf7774a0aa48d9a83d72755ec139df06d6f56827" exitCode=0
Apr 16 20:28:19.440274 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.439896 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" event={"ID":"b5c97a4c-51a6-464c-874a-dd2fdab95474","Type":"ContainerDied","Data":"5f2efefabf4aed3164143d5baf7774a0aa48d9a83d72755ec139df06d6f56827"}
Apr 16 20:28:19.441089 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.441069 2563 generic.go:358] "Generic (PLEG): container finished" podID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerID="f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759" exitCode=0
Apr 16 20:28:19.441197 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.441140 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" event={"ID":"12a0452a-d379-4b23-b0f2-bda444ad1266","Type":"ContainerDied","Data":"f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759"}
Apr 16 20:28:19.709172 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.709151 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:28:19.806686 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.806657 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c97a4c-51a6-464c-874a-dd2fdab95474-kserve-provision-location\") pod \"b5c97a4c-51a6-464c-874a-dd2fdab95474\" (UID: \"b5c97a4c-51a6-464c-874a-dd2fdab95474\") "
Apr 16 20:28:19.806965 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.806942 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5c97a4c-51a6-464c-874a-dd2fdab95474-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5c97a4c-51a6-464c-874a-dd2fdab95474" (UID: "b5c97a4c-51a6-464c-874a-dd2fdab95474"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:28:19.907361 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:19.907331 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5c97a4c-51a6-464c-874a-dd2fdab95474-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 20:28:20.451859 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:20.451829 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"
Apr 16 20:28:20.452311 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:20.451828 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb" event={"ID":"b5c97a4c-51a6-464c-874a-dd2fdab95474","Type":"ContainerDied","Data":"82576c08fa90fefa14590a4028a21966f0affdaa33acf8bac511814983d2767c"}
Apr 16 20:28:20.452311 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:20.451956 2563 scope.go:117] "RemoveContainer" containerID="5f2efefabf4aed3164143d5baf7774a0aa48d9a83d72755ec139df06d6f56827"
Apr 16 20:28:20.473815 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:20.473703 2563 scope.go:117] "RemoveContainer" containerID="0670f67ecc54d8ab13218f2253ddb5ae95429e82e9d03936c9addefd41bf80f9"
Apr 16 20:28:20.478882 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:20.478835 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"]
Apr 16 20:28:20.482304 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:20.482258 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-657wb"]
Apr 16 20:28:21.304738 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:28:21.302168 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" path="/var/lib/kubelet/pods/b5c97a4c-51a6-464c-874a-dd2fdab95474/volumes"
Apr 16 20:30:29.891858 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:30:29.891775 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" event={"ID":"12a0452a-d379-4b23-b0f2-bda444ad1266","Type":"ContainerStarted","Data":"2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58"}
Apr 16 20:30:29.892407 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:30:29.891875 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:30:29.918519 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:30:29.918380 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" podStartSLOduration=4.905391713 podStartE2EDuration="2m14.918365603s" podCreationTimestamp="2026-04-16 20:28:15 +0000 UTC" firstStartedPulling="2026-04-16 20:28:19.4421484 +0000 UTC m=+996.731074163" lastFinishedPulling="2026-04-16 20:30:29.455122281 +0000 UTC m=+1126.744048053" observedRunningTime="2026-04-16 20:30:29.917537396 +0000 UTC m=+1127.206463180" watchObservedRunningTime="2026-04-16 20:30:29.918365603 +0000 UTC m=+1127.207291390"
Apr 16 20:31:00.899264 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:00.899233 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:31:05.258350 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.258278 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"]
Apr 16 20:31:05.258732 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.258505 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" podUID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerName="kserve-container" containerID="cri-o://2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58" gracePeriod=30
Apr 16 20:31:05.372571 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.372526 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"]
Apr 16 20:31:05.372871 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.372858 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container"
Apr 16 20:31:05.372926 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.372872 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container"
Apr 16 20:31:05.372926 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.372884 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="storage-initializer"
Apr 16 20:31:05.372926 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.372891 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="storage-initializer"
Apr 16 20:31:05.373026 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.372947 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="b5c97a4c-51a6-464c-874a-dd2fdab95474" containerName="kserve-container"
Apr 16 20:31:05.437835 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.437805 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"]
Apr 16 20:31:05.438000 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.437936 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:05.502615 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.502575 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13d70d02-a06e-4305-8387-33dec3eb5d8b-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7\" (UID: \"13d70d02-a06e-4305-8387-33dec3eb5d8b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:05.603817 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.603741 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13d70d02-a06e-4305-8387-33dec3eb5d8b-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7\" (UID: \"13d70d02-a06e-4305-8387-33dec3eb5d8b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:05.604118 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.604099 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13d70d02-a06e-4305-8387-33dec3eb5d8b-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7\" (UID: \"13d70d02-a06e-4305-8387-33dec3eb5d8b\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:05.749520 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.749489 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:05.879229 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.879160 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"]
Apr 16 20:31:05.882477 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:31:05.882436 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13d70d02_a06e_4305_8387_33dec3eb5d8b.slice/crio-dc966d749aa37894bb41085c2a736b4f223c2deb472606419b51fe7e26a42dad WatchSource:0}: Error finding container dc966d749aa37894bb41085c2a736b4f223c2deb472606419b51fe7e26a42dad: Status 404 returned error can't find the container with id dc966d749aa37894bb41085c2a736b4f223c2deb472606419b51fe7e26a42dad
Apr 16 20:31:05.995228 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:05.995188 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" event={"ID":"13d70d02-a06e-4305-8387-33dec3eb5d8b","Type":"ContainerStarted","Data":"dc966d749aa37894bb41085c2a736b4f223c2deb472606419b51fe7e26a42dad"}
Apr 16 20:31:06.229765 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:06.229736 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:31:06.309369 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:06.309332 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12a0452a-d379-4b23-b0f2-bda444ad1266-kserve-provision-location\") pod \"12a0452a-d379-4b23-b0f2-bda444ad1266\" (UID: \"12a0452a-d379-4b23-b0f2-bda444ad1266\") "
Apr 16 20:31:06.309795 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:06.309641 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/12a0452a-d379-4b23-b0f2-bda444ad1266-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "12a0452a-d379-4b23-b0f2-bda444ad1266" (UID: "12a0452a-d379-4b23-b0f2-bda444ad1266"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:31:06.410062 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:06.409996 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/12a0452a-d379-4b23-b0f2-bda444ad1266-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 20:31:07.000166 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.000135 2563 generic.go:358] "Generic (PLEG): container finished" podID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerID="2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58" exitCode=0
Apr 16 20:31:07.000353 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.000210 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"
Apr 16 20:31:07.000353 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.000217 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" event={"ID":"12a0452a-d379-4b23-b0f2-bda444ad1266","Type":"ContainerDied","Data":"2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58"}
Apr 16 20:31:07.000353 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.000276 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq" event={"ID":"12a0452a-d379-4b23-b0f2-bda444ad1266","Type":"ContainerDied","Data":"ac8ecacd8094a489d20108f4d800fbb2e1c5ed14ecba088fe4292d78d6fbf51e"}
Apr 16 20:31:07.000353 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.000300 2563 scope.go:117] "RemoveContainer" containerID="2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58"
Apr 16 20:31:07.001584 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.001543 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" event={"ID":"13d70d02-a06e-4305-8387-33dec3eb5d8b","Type":"ContainerStarted","Data":"34f7560afb6300ed2e07a940b6313418f1e3ce435dacb3bc7004fbab8cfa66a8"}
Apr 16 20:31:07.008660 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.008635 2563 scope.go:117] "RemoveContainer" containerID="f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759"
Apr 16 20:31:07.015338 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.015319 2563 scope.go:117] "RemoveContainer" containerID="2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58"
Apr 16 20:31:07.015575 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:31:07.015545 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58\": container with ID starting with 2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58 not found: ID does not exist" containerID="2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58"
Apr 16 20:31:07.015626 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.015584 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58"} err="failed to get container status \"2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58\": rpc error: code = NotFound desc = could not find container \"2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58\": container with ID starting with 2c55eea57c772fe722cce295710497a117f5272ef9fe31e65a1458a9841ece58 not found: ID does not exist"
Apr 16 20:31:07.015626 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.015607 2563 scope.go:117] "RemoveContainer" containerID="f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759"
Apr 16 20:31:07.015838 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:31:07.015821 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759\": container with ID starting with f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759 not found: ID does not exist" containerID="f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759"
Apr 16 20:31:07.015881 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.015847 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759"} err="failed to get container status \"f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759\": rpc error: code = NotFound desc = could not find container \"f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759\": container with ID starting with f9d31ebd18706fe277f432a92d0875a269f7b4e2ad6a1072b394ffec1f631759 not found: ID does not exist"
Apr 16 20:31:07.032844 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.032824 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"]
Apr 16 20:31:07.036976 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.036958 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-mzsvq"]
Apr 16 20:31:07.299476 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:07.299410 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12a0452a-d379-4b23-b0f2-bda444ad1266" path="/var/lib/kubelet/pods/12a0452a-d379-4b23-b0f2-bda444ad1266/volumes"
Apr 16 20:31:10.013819 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:10.013788 2563 generic.go:358] "Generic (PLEG): container finished" podID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerID="34f7560afb6300ed2e07a940b6313418f1e3ce435dacb3bc7004fbab8cfa66a8" exitCode=0
Apr 16 20:31:10.014199 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:10.013871 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" event={"ID":"13d70d02-a06e-4305-8387-33dec3eb5d8b","Type":"ContainerDied","Data":"34f7560afb6300ed2e07a940b6313418f1e3ce435dacb3bc7004fbab8cfa66a8"}
Apr 16 20:31:11.018872 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:11.018836 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" event={"ID":"13d70d02-a06e-4305-8387-33dec3eb5d8b","Type":"ContainerStarted","Data":"1fab28868944786660f8f0d4c4617b8da1399e8fd3c9b8eca37494480c96408d"}
Apr 16 20:31:11.019311 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:11.019122 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:11.020417 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:11.020380 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 16 20:31:11.038120 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:11.038077 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" podStartSLOduration=6.038066014 podStartE2EDuration="6.038066014s" podCreationTimestamp="2026-04-16 20:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:11.036640176 +0000 UTC m=+1168.325565962" watchObservedRunningTime="2026-04-16 20:31:11.038066014 +0000 UTC m=+1168.326991799"
Apr 16 20:31:12.022771 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:12.022733 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused"
Apr 16 20:31:22.024752 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:22.024713 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:25.405161 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.405123 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"]
Apr 16 20:31:25.405672 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.405468 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="kserve-container" containerID="cri-o://1fab28868944786660f8f0d4c4617b8da1399e8fd3c9b8eca37494480c96408d" gracePeriod=30
Apr 16 20:31:25.468623 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.468582 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"]
Apr 16 20:31:25.468922 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.468910 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerName="storage-initializer"
Apr 16 20:31:25.468970 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.468924 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerName="storage-initializer"
Apr 16 20:31:25.468970 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.468942 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerName="kserve-container"
Apr 16 20:31:25.468970 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.468948 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerName="kserve-container"
Apr 16 20:31:25.469076 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.468999 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="12a0452a-d379-4b23-b0f2-bda444ad1266" containerName="kserve-container"
Apr 16 20:31:25.472196 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.472176 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"
Apr 16 20:31:25.483517 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.483468 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"]
Apr 16 20:31:25.577413 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.577370 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6835d1f4-1537-404c-85e3-510cd3751fb2-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk\" (UID: \"6835d1f4-1537-404c-85e3-510cd3751fb2\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"
Apr 16 20:31:25.677885 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.677844 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6835d1f4-1537-404c-85e3-510cd3751fb2-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk\" (UID: \"6835d1f4-1537-404c-85e3-510cd3751fb2\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"
Apr 16 20:31:25.678231 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.678207 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6835d1f4-1537-404c-85e3-510cd3751fb2-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk\" (UID: \"6835d1f4-1537-404c-85e3-510cd3751fb2\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"
Apr 16 20:31:25.785792 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.785729 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"
Apr 16 20:31:25.919255 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:25.919225 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"]
Apr 16 20:31:26.071872 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.071835 2563 generic.go:358] "Generic (PLEG): container finished" podID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerID="1fab28868944786660f8f0d4c4617b8da1399e8fd3c9b8eca37494480c96408d" exitCode=0
Apr 16 20:31:26.072042 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.071913 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" event={"ID":"13d70d02-a06e-4305-8387-33dec3eb5d8b","Type":"ContainerDied","Data":"1fab28868944786660f8f0d4c4617b8da1399e8fd3c9b8eca37494480c96408d"}
Apr 16 20:31:26.073833 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.073628 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" event={"ID":"6835d1f4-1537-404c-85e3-510cd3751fb2","Type":"ContainerStarted","Data":"d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1"}
Apr 16 20:31:26.073833 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.073665 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" event={"ID":"6835d1f4-1537-404c-85e3-510cd3751fb2","Type":"ContainerStarted","Data":"bee57432fa00dd3b225b061dd5e0367f299134ea7ed72e86b227d01e980cff24"}
Apr 16 20:31:26.131553 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.131529 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"
Apr 16 20:31:26.182313 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.182234 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13d70d02-a06e-4305-8387-33dec3eb5d8b-kserve-provision-location\") pod \"13d70d02-a06e-4305-8387-33dec3eb5d8b\" (UID: \"13d70d02-a06e-4305-8387-33dec3eb5d8b\") "
Apr 16 20:31:26.182542 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.182520 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13d70d02-a06e-4305-8387-33dec3eb5d8b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "13d70d02-a06e-4305-8387-33dec3eb5d8b" (UID: "13d70d02-a06e-4305-8387-33dec3eb5d8b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 20:31:26.283367 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:26.283333 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/13d70d02-a06e-4305-8387-33dec3eb5d8b-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 20:31:27.078422 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:27.078344 2563 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" Apr 16 20:31:27.078422 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:27.078352 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7" event={"ID":"13d70d02-a06e-4305-8387-33dec3eb5d8b","Type":"ContainerDied","Data":"dc966d749aa37894bb41085c2a736b4f223c2deb472606419b51fe7e26a42dad"} Apr 16 20:31:27.078422 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:27.078400 2563 scope.go:117] "RemoveContainer" containerID="1fab28868944786660f8f0d4c4617b8da1399e8fd3c9b8eca37494480c96408d" Apr 16 20:31:27.087282 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:27.087249 2563 scope.go:117] "RemoveContainer" containerID="34f7560afb6300ed2e07a940b6313418f1e3ce435dacb3bc7004fbab8cfa66a8" Apr 16 20:31:27.100630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:27.100608 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"] Apr 16 20:31:27.103652 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:27.103631 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-vwgd7"] Apr 16 20:31:27.298846 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:27.298812 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" path="/var/lib/kubelet/pods/13d70d02-a06e-4305-8387-33dec3eb5d8b/volumes" Apr 16 20:31:30.091484 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:30.091450 2563 generic.go:358] "Generic (PLEG): container finished" podID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerID="d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1" exitCode=0 Apr 16 20:31:30.091881 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:30.091538 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" event={"ID":"6835d1f4-1537-404c-85e3-510cd3751fb2","Type":"ContainerDied","Data":"d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1"} Apr 16 20:31:31.096339 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:31.096301 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" event={"ID":"6835d1f4-1537-404c-85e3-510cd3751fb2","Type":"ContainerStarted","Data":"491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5"} Apr 16 20:31:31.096759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:31.096530 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" Apr 16 20:31:31.113548 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:31:31.113498 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" podStartSLOduration=6.113483797 podStartE2EDuration="6.113483797s" podCreationTimestamp="2026-04-16 20:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:31:31.111967997 +0000 UTC m=+1188.400893783" watchObservedRunningTime="2026-04-16 20:31:31.113483797 +0000 UTC m=+1188.402409581" Apr 16 20:32:02.105205 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:02.105174 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" Apr 16 20:32:05.584090 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.584056 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"] Apr 16 20:32:05.584572 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.584348 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" podUID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerName="kserve-container" containerID="cri-o://491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5" gracePeriod=30 Apr 16 20:32:05.639874 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.639831 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q"] Apr 16 20:32:05.640282 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.640267 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="storage-initializer" Apr 16 20:32:05.640326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.640286 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="storage-initializer" Apr 16 20:32:05.640326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.640302 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="kserve-container" Apr 16 20:32:05.640326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.640311 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="kserve-container" Apr 16 20:32:05.640444 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.640432 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="13d70d02-a06e-4305-8387-33dec3eb5d8b" containerName="kserve-container" Apr 16 20:32:05.645441 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.645420 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:32:05.655766 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.655730 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q"] Apr 16 20:32:05.708471 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.708426 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1dcd167-72f9-49a1-903d-7f722d12d37e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-bd659dd86-w6z2q\" (UID: \"a1dcd167-72f9-49a1-903d-7f722d12d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:32:05.808890 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.808847 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1dcd167-72f9-49a1-903d-7f722d12d37e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-bd659dd86-w6z2q\" (UID: \"a1dcd167-72f9-49a1-903d-7f722d12d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:32:05.809199 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.809182 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1dcd167-72f9-49a1-903d-7f722d12d37e-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-bd659dd86-w6z2q\" (UID: \"a1dcd167-72f9-49a1-903d-7f722d12d37e\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:32:05.958392 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:05.958356 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:32:06.097865 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:06.097831 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q"] Apr 16 20:32:06.103832 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:32:06.103805 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1dcd167_72f9_49a1_903d_7f722d12d37e.slice/crio-03fe5b1ad310c29f7b5919c9e7a0fdefa4a2d99bb2c35de9450e263d7ca3a079 WatchSource:0}: Error finding container 03fe5b1ad310c29f7b5919c9e7a0fdefa4a2d99bb2c35de9450e263d7ca3a079: Status 404 returned error can't find the container with id 03fe5b1ad310c29f7b5919c9e7a0fdefa4a2d99bb2c35de9450e263d7ca3a079 Apr 16 20:32:06.212610 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:06.212499 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerStarted","Data":"e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055"} Apr 16 20:32:06.212610 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:06.212548 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerStarted","Data":"03fe5b1ad310c29f7b5919c9e7a0fdefa4a2d99bb2c35de9450e263d7ca3a079"} Apr 16 20:32:06.816310 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:06.816286 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" Apr 16 20:32:06.917517 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:06.917490 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6835d1f4-1537-404c-85e3-510cd3751fb2-kserve-provision-location\") pod \"6835d1f4-1537-404c-85e3-510cd3751fb2\" (UID: \"6835d1f4-1537-404c-85e3-510cd3751fb2\") " Apr 16 20:32:06.917852 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:06.917830 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6835d1f4-1537-404c-85e3-510cd3751fb2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6835d1f4-1537-404c-85e3-510cd3751fb2" (UID: "6835d1f4-1537-404c-85e3-510cd3751fb2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:32:07.018621 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.018555 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6835d1f4-1537-404c-85e3-510cd3751fb2-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:32:07.218449 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.218365 2563 generic.go:358] "Generic (PLEG): container finished" podID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerID="491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5" exitCode=0 Apr 16 20:32:07.218449 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.218433 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" Apr 16 20:32:07.218682 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.218456 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" event={"ID":"6835d1f4-1537-404c-85e3-510cd3751fb2","Type":"ContainerDied","Data":"491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5"} Apr 16 20:32:07.218682 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.218496 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk" event={"ID":"6835d1f4-1537-404c-85e3-510cd3751fb2","Type":"ContainerDied","Data":"bee57432fa00dd3b225b061dd5e0367f299134ea7ed72e86b227d01e980cff24"} Apr 16 20:32:07.218682 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.218514 2563 scope.go:117] "RemoveContainer" containerID="491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5" Apr 16 20:32:07.228479 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.228451 2563 scope.go:117] "RemoveContainer" containerID="d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1" Apr 16 20:32:07.235513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.235496 2563 scope.go:117] "RemoveContainer" containerID="491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5" Apr 16 20:32:07.235759 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:32:07.235739 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5\": container with ID starting with 491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5 not found: ID does not 
exist" containerID="491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5" Apr 16 20:32:07.235808 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.235768 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5"} err="failed to get container status \"491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5\": rpc error: code = NotFound desc = could not find container \"491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5\": container with ID starting with 491716078d6171d41638bfc4d549c577655cc3bd5512740aa7a7f37d5c03daf5 not found: ID does not exist" Apr 16 20:32:07.235808 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.235785 2563 scope.go:117] "RemoveContainer" containerID="d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1" Apr 16 20:32:07.236026 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:32:07.236008 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1\": container with ID starting with d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1 not found: ID does not exist" containerID="d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1" Apr 16 20:32:07.236078 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.236031 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1"} err="failed to get container status \"d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1\": rpc error: code = NotFound desc = could not find container \"d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1\": container with ID starting with d80c76347a64f5e757ff01fec9f072a7d1995ce58c5b87f5c99fb6195feedda1 not found: ID does not exist" Apr 16 20:32:07.240726 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.240704 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"] Apr 16 20:32:07.242897 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.242876 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-fwxhk"] Apr 16 20:32:07.299205 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:07.299175 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6835d1f4-1537-404c-85e3-510cd3751fb2" path="/var/lib/kubelet/pods/6835d1f4-1537-404c-85e3-510cd3751fb2/volumes" Apr 16 20:32:10.232574 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:10.232536 2563 generic.go:358] "Generic (PLEG): container finished" podID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerID="e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055" exitCode=0 Apr 16 20:32:10.232931 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:10.232609 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerDied","Data":"e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055"} Apr 16 20:32:11.240134 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:11.239978 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" 
event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerStarted","Data":"8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad"} Apr 16 20:32:13.249440 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:13.249405 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerStarted","Data":"9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817"} Apr 16 20:32:13.249939 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:13.249554 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:32:13.270824 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:13.270765 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" podStartSLOduration=6.079848424 podStartE2EDuration="8.270749629s" podCreationTimestamp="2026-04-16 20:32:05 +0000 UTC" firstStartedPulling="2026-04-16 20:32:10.29345473 +0000 UTC m=+1227.582380493" lastFinishedPulling="2026-04-16 20:32:12.48435593 +0000 UTC m=+1229.773281698" observedRunningTime="2026-04-16 20:32:13.269219232 +0000 UTC m=+1230.558145053" watchObservedRunningTime="2026-04-16 20:32:13.270749629 +0000 UTC m=+1230.559675415" Apr 16 20:32:14.253221 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:14.253192 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:32:45.259139 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:32:45.259103 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:33:15.260167 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.260126 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:33:15.744777 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.744730 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q"] Apr 16 20:33:15.745131 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.745075 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-container" containerID="cri-o://8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad" gracePeriod=30 Apr 16 20:33:15.745290 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.745121 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-agent" containerID="cri-o://9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817" gracePeriod=30 Apr 16 20:33:15.801667 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.801626 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n"] Apr 16 20:33:15.802036 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.802018 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerName="storage-initializer" Apr 16 
20:33:15.802120 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.802039 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerName="storage-initializer" Apr 16 20:33:15.802120 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.802055 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerName="kserve-container" Apr 16 20:33:15.802120 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.802064 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerName="kserve-container" Apr 16 20:33:15.802268 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.802146 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="6835d1f4-1537-404c-85e3-510cd3751fb2" containerName="kserve-container" Apr 16 20:33:15.805254 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.805232 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:33:15.814680 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.814655 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n"] Apr 16 20:33:15.887076 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.887025 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81bf13f0-206b-4436-95da-35ffd9a712dc-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-n5c9n\" (UID: \"81bf13f0-206b-4436-95da-35ffd9a712dc\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:33:15.988215 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.988156 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81bf13f0-206b-4436-95da-35ffd9a712dc-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-n5c9n\" (UID: \"81bf13f0-206b-4436-95da-35ffd9a712dc\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:33:15.988594 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:15.988552 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81bf13f0-206b-4436-95da-35ffd9a712dc-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-n5c9n\" (UID: \"81bf13f0-206b-4436-95da-35ffd9a712dc\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:33:16.116696 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:16.116584 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:33:16.248813 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:16.248774 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n"] Apr 16 20:33:16.252218 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:33:16.252182 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81bf13f0_206b_4436_95da_35ffd9a712dc.slice/crio-ddceaf1b09a5bde184334bf0c777cc0f9b349c5aa0ae7c1636c3342e9b7c6c6a WatchSource:0}: Error finding container ddceaf1b09a5bde184334bf0c777cc0f9b349c5aa0ae7c1636c3342e9b7c6c6a: Status 404 returned error can't find the container with id ddceaf1b09a5bde184334bf0c777cc0f9b349c5aa0ae7c1636c3342e9b7c6c6a Apr 16 20:33:16.254280 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:16.254260 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:33:16.451474 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:16.451436 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" event={"ID":"81bf13f0-206b-4436-95da-35ffd9a712dc","Type":"ContainerStarted","Data":"67c2d5a2c5046a8b3a4de976294eadb500ba5518a411f64980756aec6e754e82"} Apr 16 20:33:16.451946 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:16.451481 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" event={"ID":"81bf13f0-206b-4436-95da-35ffd9a712dc","Type":"ContainerStarted","Data":"ddceaf1b09a5bde184334bf0c777cc0f9b349c5aa0ae7c1636c3342e9b7c6c6a"} Apr 16 20:33:18.460536 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:18.460502 2563 generic.go:358] "Generic (PLEG): container finished" podID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerID="8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad" exitCode=0 Apr 16 20:33:18.460929 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:18.460584 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerDied","Data":"8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad"} Apr 16 20:33:22.475398 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:22.475365 2563 generic.go:358] "Generic (PLEG): container finished" podID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerID="67c2d5a2c5046a8b3a4de976294eadb500ba5518a411f64980756aec6e754e82" exitCode=0 Apr 16 20:33:22.475791 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:22.475434 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" event={"ID":"81bf13f0-206b-4436-95da-35ffd9a712dc","Type":"ContainerDied","Data":"67c2d5a2c5046a8b3a4de976294eadb500ba5518a411f64980756aec6e754e82"} Apr 16 20:33:25.256519 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:25.256458 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.36:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:33:34.528348 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:34.528310 2563 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" event={"ID":"81bf13f0-206b-4436-95da-35ffd9a712dc","Type":"ContainerStarted","Data":"986ebd670b3b642492ceef3690631776f5b0fc251ac57d689abf238cb7cb9972"} Apr 16 20:33:34.528861 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:34.528655 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:33:34.529836 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:34.529810 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:33:34.546884 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:34.546831 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" podStartSLOduration=8.340752886 podStartE2EDuration="19.546820147s" podCreationTimestamp="2026-04-16 20:33:15 +0000 UTC" firstStartedPulling="2026-04-16 20:33:22.476584028 +0000 UTC m=+1299.765509791" lastFinishedPulling="2026-04-16 20:33:33.682651285 +0000 UTC m=+1310.971577052" observedRunningTime="2026-04-16 20:33:34.545285724 +0000 UTC m=+1311.834211509" watchObservedRunningTime="2026-04-16 20:33:34.546820147 +0000 UTC m=+1311.835745932" Apr 16 20:33:35.256742 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:35.256702 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.36:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:33:35.532346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:35.532266 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:33:45.256368 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:45.256330 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.36:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 20:33:45.256744 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:45.256461 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:33:45.532874 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:45.532781 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:33:46.385916 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.385894 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:33:46.532103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.532020 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1dcd167-72f9-49a1-903d-7f722d12d37e-kserve-provision-location\") pod \"a1dcd167-72f9-49a1-903d-7f722d12d37e\" (UID: \"a1dcd167-72f9-49a1-903d-7f722d12d37e\") " Apr 16 20:33:46.532320 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.532294 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1dcd167-72f9-49a1-903d-7f722d12d37e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a1dcd167-72f9-49a1-903d-7f722d12d37e" (UID: "a1dcd167-72f9-49a1-903d-7f722d12d37e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:33:46.572319 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.572291 2563 generic.go:358] "Generic (PLEG): container finished" podID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerID="9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817" exitCode=0 Apr 16 20:33:46.572452 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.572341 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerDied","Data":"9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817"} Apr 16 20:33:46.572452 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.572366 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" event={"ID":"a1dcd167-72f9-49a1-903d-7f722d12d37e","Type":"ContainerDied","Data":"03fe5b1ad310c29f7b5919c9e7a0fdefa4a2d99bb2c35de9450e263d7ca3a079"} Apr 16 20:33:46.572452 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.572381 2563 scope.go:117] "RemoveContainer" containerID="9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817" Apr 16 20:33:46.572452 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.572381 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q" Apr 16 20:33:46.580146 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.579998 2563 scope.go:117] "RemoveContainer" containerID="8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad" Apr 16 20:33:46.587048 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.587033 2563 scope.go:117] "RemoveContainer" containerID="e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055" Apr 16 20:33:46.594236 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.594217 2563 scope.go:117] "RemoveContainer" containerID="9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817" Apr 16 20:33:46.594464 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:33:46.594446 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817\": container with ID starting with 9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817 not found: ID does not exist" containerID="9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817" Apr 16 20:33:46.594524 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.594472 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817"} err="failed to get container status \"9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817\": rpc error: code = NotFound desc = could not find container \"9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817\": container with ID starting with 9798125e9be5e74e972408eb1dad20ac09817e4e540213de97a6f19412dcf817 not found: ID does not exist" Apr 16 20:33:46.594524 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.594487 2563 scope.go:117] "RemoveContainer" containerID="8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad" Apr 16 20:33:46.594750 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:33:46.594733 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad\": container with ID starting with 8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad not found: ID does not exist" containerID="8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad" Apr 16 20:33:46.594793 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.594757 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad"} err="failed to get container status \"8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad\": rpc error: code = NotFound desc = could not find container \"8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad\": container with ID starting with 8ecdd3b15d7ad0f2a2d079bcd3f1b5494f2ec8a9e706f5d277b8550a3c167aad not found: ID does not exist" Apr 16 20:33:46.594793 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.594773 2563 scope.go:117] "RemoveContainer" containerID="e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055" Apr 16 20:33:46.594981 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:33:46.594968 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055\": 
container with ID starting with e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055 not found: ID does not exist" containerID="e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055" Apr 16 20:33:46.595022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.594983 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055"} err="failed to get container status \"e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055\": rpc error: code = NotFound desc = could not find container \"e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055\": container with ID starting with e8006f996d0572aeb01ffdc9ed42de22a82047697c948e03b9d7c56a81778055 not found: ID does not exist" Apr 16 20:33:46.595022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.595009 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q"] Apr 16 20:33:46.597534 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.597515 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-bd659dd86-w6z2q"] Apr 16 20:33:46.632444 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:46.632422 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a1dcd167-72f9-49a1-903d-7f722d12d37e-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:33:47.298810 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:47.298775 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" path="/var/lib/kubelet/pods/a1dcd167-72f9-49a1-903d-7f722d12d37e/volumes" Apr 16 20:33:55.532704 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:33:55.532659 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:34:05.532281 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:05.532200 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused" Apr 16 20:34:15.533738 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:15.533707 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:34:17.292747 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.292713 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n"] Apr 16 20:34:17.293092 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.292993 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" containerID="cri-o://986ebd670b3b642492ceef3690631776f5b0fc251ac57d689abf238cb7cb9972" gracePeriod=30 Apr 16 20:34:17.375045 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375010 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9"] Apr 16 20:34:17.375377 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375362 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-container" Apr 16 20:34:17.375438 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375379 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-container" Apr 16 20:34:17.375438 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375396 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="storage-initializer" Apr 16 20:34:17.375438 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375401 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="storage-initializer" Apr 16 20:34:17.375438 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375418 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-agent" Apr 16 20:34:17.375438 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375423 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-agent" Apr 16 20:34:17.375623 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375470 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-agent" Apr 16 20:34:17.375623 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.375481 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a1dcd167-72f9-49a1-903d-7f722d12d37e" containerName="kserve-container" Apr 16 20:34:17.378211 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.378189 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:34:17.390247 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.390226 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9"] Apr 16 20:34:17.574950 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.574868 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3c558c-e9f0-4052-950c-52ffe9064777-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zvjk9\" (UID: \"4a3c558c-e9f0-4052-950c-52ffe9064777\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:34:17.675858 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.675831 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3c558c-e9f0-4052-950c-52ffe9064777-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zvjk9\" (UID: \"4a3c558c-e9f0-4052-950c-52ffe9064777\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:34:17.676131 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.676115 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3c558c-e9f0-4052-950c-52ffe9064777-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-zvjk9\" (UID: \"4a3c558c-e9f0-4052-950c-52ffe9064777\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:34:17.688400 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.688381 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:34:17.807942 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:17.807847 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9"] Apr 16 20:34:17.810936 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:34:17.810907 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a3c558c_e9f0_4052_950c_52ffe9064777.slice/crio-fcd5f6adf8a12febf149acdc4ebbc98a78e9cc4a83333e1635ea03b795ffd073 WatchSource:0}: Error finding container fcd5f6adf8a12febf149acdc4ebbc98a78e9cc4a83333e1635ea03b795ffd073: Status 404 returned error can't find the container with id fcd5f6adf8a12febf149acdc4ebbc98a78e9cc4a83333e1635ea03b795ffd073 Apr 16 20:34:18.680792 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:18.680752 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" event={"ID":"4a3c558c-e9f0-4052-950c-52ffe9064777","Type":"ContainerStarted","Data":"891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a"} Apr 16 20:34:18.680792 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:18.680794 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" event={"ID":"4a3c558c-e9f0-4052-950c-52ffe9064777","Type":"ContainerStarted","Data":"fcd5f6adf8a12febf149acdc4ebbc98a78e9cc4a83333e1635ea03b795ffd073"} Apr 16 20:34:19.686555 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:19.686526 2563 generic.go:358] "Generic (PLEG): container finished" podID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerID="986ebd670b3b642492ceef3690631776f5b0fc251ac57d689abf238cb7cb9972" exitCode=0 Apr 16 20:34:19.686912 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:19.686602 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" event={"ID":"81bf13f0-206b-4436-95da-35ffd9a712dc","Type":"ContainerDied","Data":"986ebd670b3b642492ceef3690631776f5b0fc251ac57d689abf238cb7cb9972"} Apr 16 20:34:19.733151 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:19.733130 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:34:19.792671 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:19.792648 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81bf13f0-206b-4436-95da-35ffd9a712dc-kserve-provision-location\") pod \"81bf13f0-206b-4436-95da-35ffd9a712dc\" (UID: \"81bf13f0-206b-4436-95da-35ffd9a712dc\") " Apr 16 20:34:19.802166 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:19.802114 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81bf13f0-206b-4436-95da-35ffd9a712dc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "81bf13f0-206b-4436-95da-35ffd9a712dc" (UID: "81bf13f0-206b-4436-95da-35ffd9a712dc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:34:19.893273 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:19.893249 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/81bf13f0-206b-4436-95da-35ffd9a712dc-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:34:20.691468 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:20.691427 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" event={"ID":"81bf13f0-206b-4436-95da-35ffd9a712dc","Type":"ContainerDied","Data":"ddceaf1b09a5bde184334bf0c777cc0f9b349c5aa0ae7c1636c3342e9b7c6c6a"} Apr 16 20:34:20.691468 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:20.691469 2563 scope.go:117] "RemoveContainer" containerID="986ebd670b3b642492ceef3690631776f5b0fc251ac57d689abf238cb7cb9972" Apr 16 20:34:20.691964 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:20.691477 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n" Apr 16 20:34:20.699809 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:20.699791 2563 scope.go:117] "RemoveContainer" containerID="67c2d5a2c5046a8b3a4de976294eadb500ba5518a411f64980756aec6e754e82" Apr 16 20:34:20.712548 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:20.712526 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n"] Apr 16 20:34:20.715787 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:20.715766 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-n5c9n"] Apr 16 20:34:21.299044 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:21.298978 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" path="/var/lib/kubelet/pods/81bf13f0-206b-4436-95da-35ffd9a712dc/volumes" Apr 16 20:34:22.701252 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:22.701220 2563 generic.go:358] "Generic (PLEG): container finished" podID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerID="891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a" exitCode=0 Apr 16 20:34:22.701720 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:22.701293 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" event={"ID":"4a3c558c-e9f0-4052-950c-52ffe9064777","Type":"ContainerDied","Data":"891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a"} Apr 16 20:34:23.706446 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:23.706412 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" event={"ID":"4a3c558c-e9f0-4052-950c-52ffe9064777","Type":"ContainerStarted","Data":"0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166"} Apr 16 20:34:23.706860 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:23.706706 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:34:23.707965 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:23.707942 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:34:23.725380 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:23.725327 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" podStartSLOduration=6.725315147 podStartE2EDuration="6.725315147s" podCreationTimestamp="2026-04-16 20:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:34:23.723814836 +0000 UTC m=+1361.012740620" watchObservedRunningTime="2026-04-16 20:34:23.725315147 +0000 UTC m=+1361.014240932" Apr 16 20:34:24.710127 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:24.710091 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:34:34.711087 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:34.711043 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:34:44.710193 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:44.710153 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:34:54.710674 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:34:54.710633 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 20:35:04.711103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:04.711069 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:35:08.872682 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.872650 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9"] Apr 16 20:35:08.873073 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.872871 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" containerID="cri-o://0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166" gracePeriod=30 Apr 16 20:35:08.946318 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.946286 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb"] Apr 16 20:35:08.946760 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.946746 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="storage-initializer" Apr 16 20:35:08.946818 ip-10-0-138-118 
kubenswrapper[2563]: I0416 20:35:08.946764 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="storage-initializer" Apr 16 20:35:08.946818 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.946785 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" Apr 16 20:35:08.946818 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.946794 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" Apr 16 20:35:08.946911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.946879 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="81bf13f0-206b-4436-95da-35ffd9a712dc" containerName="kserve-container" Apr 16 20:35:08.951618 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.951599 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:35:08.956984 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:08.956963 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb"] Apr 16 20:35:09.043925 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:09.043901 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb\" (UID: \"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:35:09.145428 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:09.145353 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb\" (UID: \"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:35:09.145734 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:09.145713 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb\" (UID: \"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:35:09.262901 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:09.262876 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:35:09.383297 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:09.383270 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb"] Apr 16 20:35:09.385687 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:35:09.385658 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad58b2c8_a8c3_4d2e_8afb_1531b1eaaf06.slice/crio-e5de8a7b906582a5fcdbb49b90dfe72a858f3650d9e382256cb9c85847136067 WatchSource:0}: Error finding container e5de8a7b906582a5fcdbb49b90dfe72a858f3650d9e382256cb9c85847136067: Status 404 returned error can't find the container with id e5de8a7b906582a5fcdbb49b90dfe72a858f3650d9e382256cb9c85847136067 Apr 16 20:35:09.854976 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:09.854941 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" event={"ID":"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06","Type":"ContainerStarted","Data":"849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e"} Apr 16 20:35:09.854976 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:09.854980 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" event={"ID":"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06","Type":"ContainerStarted","Data":"e5de8a7b906582a5fcdbb49b90dfe72a858f3650d9e382256cb9c85847136067"} Apr 16 20:35:11.316233 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.316204 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:35:11.464167 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.464145 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3c558c-e9f0-4052-950c-52ffe9064777-kserve-provision-location\") pod \"4a3c558c-e9f0-4052-950c-52ffe9064777\" (UID: \"4a3c558c-e9f0-4052-950c-52ffe9064777\") " Apr 16 20:35:11.473728 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.473702 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a3c558c-e9f0-4052-950c-52ffe9064777-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a3c558c-e9f0-4052-950c-52ffe9064777" (UID: "4a3c558c-e9f0-4052-950c-52ffe9064777"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:35:11.564874 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.564848 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a3c558c-e9f0-4052-950c-52ffe9064777-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:35:11.865125 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.865049 2563 generic.go:358] "Generic (PLEG): container finished" podID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerID="0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166" exitCode=0 Apr 16 20:35:11.865125 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.865104 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" event={"ID":"4a3c558c-e9f0-4052-950c-52ffe9064777","Type":"ContainerDied","Data":"0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166"} Apr 16 20:35:11.865125 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.865114 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" Apr 16 20:35:11.865326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.865132 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9" event={"ID":"4a3c558c-e9f0-4052-950c-52ffe9064777","Type":"ContainerDied","Data":"fcd5f6adf8a12febf149acdc4ebbc98a78e9cc4a83333e1635ea03b795ffd073"} Apr 16 20:35:11.865326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.865146 2563 scope.go:117] "RemoveContainer" containerID="0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166" Apr 16 20:35:11.873637 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.873620 2563 scope.go:117] "RemoveContainer" containerID="891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a" Apr 16 20:35:11.880435 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.880419 2563 scope.go:117] "RemoveContainer" containerID="0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166" Apr 16 20:35:11.880698 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:35:11.880680 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166\": container with ID starting with 0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166 not found: ID does not exist" containerID="0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166" Apr 16 20:35:11.880747 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.880708 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166"} err="failed to get container status \"0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166\": rpc error: code = NotFound desc = could not find container \"0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166\": container with ID starting with 0d9b240189730367c03acd70800ee70ffcf8a06222790f77cc6ec4f8049e7166 not found: ID does not exist" Apr 16 20:35:11.880747 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.880726 2563 scope.go:117] "RemoveContainer" containerID="891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a" Apr 16 20:35:11.880927 ip-10-0-138-118 
kubenswrapper[2563]: E0416 20:35:11.880912 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a\": container with ID starting with 891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a not found: ID does not exist" containerID="891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a" Apr 16 20:35:11.880968 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.880933 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a"} err="failed to get container status \"891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a\": rpc error: code = NotFound desc = could not find container \"891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a\": container with ID starting with 891c0aea8d3375117bf03309dd79f138b4181527677b2d0f95d1a6a647f5640a not found: ID does not exist" Apr 16 20:35:11.887938 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.887915 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9"] Apr 16 20:35:11.890866 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:11.890847 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-zvjk9"] Apr 16 20:35:13.298811 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:13.298752 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" path="/var/lib/kubelet/pods/4a3c558c-e9f0-4052-950c-52ffe9064777/volumes" Apr 16 20:35:13.874156 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:13.874072 2563 generic.go:358] "Generic (PLEG): container finished" podID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerID="849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e" exitCode=0 Apr 16 20:35:13.874156 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:13.874144 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" event={"ID":"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06","Type":"ContainerDied","Data":"849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e"} Apr 16 20:35:14.878451 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:14.878422 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" event={"ID":"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06","Type":"ContainerStarted","Data":"ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a"} Apr 16 20:35:14.878832 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:14.878741 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:35:14.879795 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:14.879768 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 20:35:14.895111 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:14.895063 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" podStartSLOduration=6.895048995 podStartE2EDuration="6.895048995s" podCreationTimestamp="2026-04-16 20:35:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:35:14.894652162 +0000 UTC m=+1412.183577947" watchObservedRunningTime="2026-04-16 20:35:14.895048995 +0000 UTC m=+1412.183974779" Apr 16 20:35:15.882429 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:15.882392 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 20:35:25.882635 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:25.882590 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 20:35:35.883273 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:35.883192 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 20:35:45.883088 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:45.883042 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 20:35:55.883733 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:35:55.883706 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:36:00.597501 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.597465 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb"] Apr 16 20:36:00.598006 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.597721 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" containerID="cri-o://ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a" gracePeriod=30 Apr 16 20:36:00.673110 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.673075 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch"] Apr 16 20:36:00.673402 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.673390 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" Apr 16 20:36:00.673458 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.673404 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" Apr 16 20:36:00.673458 ip-10-0-138-118 kubenswrapper[2563]: I0416 
20:36:00.673432 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="storage-initializer" Apr 16 20:36:00.673458 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.673438 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="storage-initializer" Apr 16 20:36:00.673577 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.673495 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a3c558c-e9f0-4052-950c-52ffe9064777" containerName="kserve-container" Apr 16 20:36:00.676527 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.676509 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:36:00.684097 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.683745 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch"] Apr 16 20:36:00.726465 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.726439 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/066b2ead-92b6-4c81-a1b9-4a336483a9e3-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-7jbch\" (UID: \"066b2ead-92b6-4c81-a1b9-4a336483a9e3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:36:00.827067 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.827033 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/066b2ead-92b6-4c81-a1b9-4a336483a9e3-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-7jbch\" (UID: \"066b2ead-92b6-4c81-a1b9-4a336483a9e3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:36:00.827378 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.827358 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/066b2ead-92b6-4c81-a1b9-4a336483a9e3-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-7jbch\" (UID: \"066b2ead-92b6-4c81-a1b9-4a336483a9e3\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:36:00.988283 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:00.988249 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:36:01.114758 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:01.114719 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch"] Apr 16 20:36:01.118121 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:36:01.118090 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod066b2ead_92b6_4c81_a1b9_4a336483a9e3.slice/crio-2ea4acf08814a8ea47494613cfecac9e2dd2faa59235f7167e540fb7ef5e107c WatchSource:0}: Error finding container 2ea4acf08814a8ea47494613cfecac9e2dd2faa59235f7167e540fb7ef5e107c: Status 404 returned error can't find the container with id 2ea4acf08814a8ea47494613cfecac9e2dd2faa59235f7167e540fb7ef5e107c Apr 16 20:36:02.020002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:02.019965 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" event={"ID":"066b2ead-92b6-4c81-a1b9-4a336483a9e3","Type":"ContainerStarted","Data":"940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064"} Apr 16 20:36:02.020377 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:02.020012 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" event={"ID":"066b2ead-92b6-4c81-a1b9-4a336483a9e3","Type":"ContainerStarted","Data":"2ea4acf08814a8ea47494613cfecac9e2dd2faa59235f7167e540fb7ef5e107c"} Apr 16 20:36:03.239321 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:03.239297 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:36:03.351521 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:03.351445 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06-kserve-provision-location\") pod \"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06\" (UID: \"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06\") " Apr 16 20:36:03.360877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:03.360851 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" (UID: "ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:36:03.452517 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:03.452484 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:36:04.029050 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.029014 2563 generic.go:358] "Generic (PLEG): container finished" podID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerID="ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a" exitCode=0 Apr 16 20:36:04.029216 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.029080 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" Apr 16 20:36:04.029216 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.029096 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" event={"ID":"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06","Type":"ContainerDied","Data":"ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a"} Apr 16 20:36:04.029216 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.029132 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb" event={"ID":"ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06","Type":"ContainerDied","Data":"e5de8a7b906582a5fcdbb49b90dfe72a858f3650d9e382256cb9c85847136067"} Apr 16 20:36:04.029216 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.029146 2563 scope.go:117] "RemoveContainer" containerID="ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a" Apr 16 20:36:04.037176 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.037162 2563 scope.go:117] "RemoveContainer" containerID="849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e" Apr 16 20:36:04.043889 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.043872 2563 scope.go:117] "RemoveContainer" containerID="ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a" Apr 16 20:36:04.044127 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:36:04.044110 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a\": container with ID starting with ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a not found: ID does not exist" containerID="ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a" Apr 16 20:36:04.044172 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.044135 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a"} err="failed to get container status \"ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a\": rpc error: code = NotFound desc = could not find container \"ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a\": container with ID starting with ed9cfe89dff4350759954b2b84569d3301b5325873d21f61b8e7a10a4ff0a97a not found: ID does not exist" Apr 16 20:36:04.044172 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.044151 2563 scope.go:117] "RemoveContainer" containerID="849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e" Apr 16 20:36:04.044352 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:36:04.044338 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e\": container with ID starting with 849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e not found: ID does not exist" containerID="849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e" Apr 16 20:36:04.044388 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.044354 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e"} err="failed to get container status \"849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e\": rpc 
error: code = NotFound desc = could not find container \"849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e\": container with ID starting with 849a5b4ecf44e9647a7c239f673b01e156d0735e5a19b81dfb171131585bfb1e not found: ID does not exist" Apr 16 20:36:04.051250 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.051228 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb"] Apr 16 20:36:04.056161 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:04.056133 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-67fsb"] Apr 16 20:36:05.299154 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:05.299123 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" path="/var/lib/kubelet/pods/ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06/volumes" Apr 16 20:36:06.038424 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:06.038391 2563 generic.go:358] "Generic (PLEG): container finished" podID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerID="940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064" exitCode=0 Apr 16 20:36:06.038606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:06.038451 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" event={"ID":"066b2ead-92b6-4c81-a1b9-4a336483a9e3","Type":"ContainerDied","Data":"940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064"} Apr 16 20:36:13.069215 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:13.069132 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" event={"ID":"066b2ead-92b6-4c81-a1b9-4a336483a9e3","Type":"ContainerStarted","Data":"1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0"} Apr 16 20:36:13.069634 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:13.069452 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:36:13.070780 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:13.070756 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:36:13.092246 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:13.092199 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podStartSLOduration=6.377765443 podStartE2EDuration="13.092187767s" podCreationTimestamp="2026-04-16 20:36:00 +0000 UTC" firstStartedPulling="2026-04-16 20:36:06.039643023 +0000 UTC m=+1463.328568785" lastFinishedPulling="2026-04-16 20:36:12.754065332 +0000 UTC m=+1470.042991109" observedRunningTime="2026-04-16 20:36:13.090829841 +0000 UTC m=+1470.379755627" watchObservedRunningTime="2026-04-16 20:36:13.092187767 +0000 UTC m=+1470.381113552" Apr 16 20:36:14.072388 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:14.072348 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 
20:36:24.073200 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:24.073157 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:36:34.073371 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:34.073326 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:36:44.073314 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:44.073270 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:36:54.072747 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:36:54.072704 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:37:04.073415 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:04.073331 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:37:14.072975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:14.072931 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:37:24.072830 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:24.072783 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 20:37:31.298428 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:31.298398 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:37:41.799935 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.799904 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch"] Apr 16 20:37:41.800373 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.800169 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" containerID="cri-o://1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0" gracePeriod=30 Apr 16 20:37:41.887404 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.887375 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m"] Apr 16 20:37:41.887743 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.887729 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" Apr 16 20:37:41.887798 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.887745 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" Apr 16 20:37:41.887798 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.887756 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="storage-initializer" Apr 16 20:37:41.887798 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.887761 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="storage-initializer" Apr 16 20:37:41.887892 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.887812 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad58b2c8-a8c3-4d2e-8afb-1531b1eaaf06" containerName="kserve-container" Apr 16 20:37:41.890724 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.890709 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:37:41.899045 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.899024 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m"] Apr 16 20:37:41.978095 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:41.978066 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-hwj7m\" (UID: \"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:37:42.079219 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:42.079150 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-hwj7m\" (UID: \"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:37:42.079615 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:42.079595 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-hwj7m\" (UID: \"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:37:42.202318 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:42.202295 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:37:42.326118 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:42.326083 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m"] Apr 16 20:37:42.330597 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:37:42.330511 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8ce21e5_3f56_4ac7_8e2f_ed90d47729e7.slice/crio-54027331e3e9727438be354a28f13335f8a6eb6f36d42d149224eabddc614158 WatchSource:0}: Error finding container 54027331e3e9727438be354a28f13335f8a6eb6f36d42d149224eabddc614158: Status 404 returned error can't find the container with id 54027331e3e9727438be354a28f13335f8a6eb6f36d42d149224eabddc614158 Apr 16 20:37:42.342609 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:42.342586 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" event={"ID":"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7","Type":"ContainerStarted","Data":"54027331e3e9727438be354a28f13335f8a6eb6f36d42d149224eabddc614158"} Apr 16 20:37:43.346721 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:43.346689 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" event={"ID":"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7","Type":"ContainerStarted","Data":"3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882"} Apr 16 20:37:45.040202 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.040179 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:37:45.100804 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.100740 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/066b2ead-92b6-4c81-a1b9-4a336483a9e3-kserve-provision-location\") pod \"066b2ead-92b6-4c81-a1b9-4a336483a9e3\" (UID: \"066b2ead-92b6-4c81-a1b9-4a336483a9e3\") " Apr 16 20:37:45.101038 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.101016 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/066b2ead-92b6-4c81-a1b9-4a336483a9e3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "066b2ead-92b6-4c81-a1b9-4a336483a9e3" (UID: "066b2ead-92b6-4c81-a1b9-4a336483a9e3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:37:45.201678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.201656 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/066b2ead-92b6-4c81-a1b9-4a336483a9e3-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:37:45.354783 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.354728 2563 generic.go:358] "Generic (PLEG): container finished" podID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerID="1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0" exitCode=0 Apr 16 20:37:45.354889 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.354795 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" Apr 16 20:37:45.354889 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.354814 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" event={"ID":"066b2ead-92b6-4c81-a1b9-4a336483a9e3","Type":"ContainerDied","Data":"1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0"} Apr 16 20:37:45.354889 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.354858 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch" event={"ID":"066b2ead-92b6-4c81-a1b9-4a336483a9e3","Type":"ContainerDied","Data":"2ea4acf08814a8ea47494613cfecac9e2dd2faa59235f7167e540fb7ef5e107c"} Apr 16 20:37:45.354889 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.354878 2563 scope.go:117] "RemoveContainer" containerID="1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0" Apr 16 20:37:45.362588 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.362551 2563 scope.go:117] "RemoveContainer" containerID="940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064" Apr 16 20:37:45.369215 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.369199 2563 scope.go:117] "RemoveContainer" containerID="1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0" Apr 16 20:37:45.369490 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:37:45.369466 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0\": container with ID starting with 1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0 not found: ID does not exist" containerID="1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0" Apr 16 20:37:45.369603 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.369500 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0"} err="failed to get container status \"1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0\": rpc error: code = NotFound desc = could not find container \"1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0\": container with ID starting with 1cdcf53d63009269e0941c5fe78026dd9cfe252989bfcb4568659f9aab9780b0 not found: ID does not exist" Apr 16 20:37:45.369603 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.369523 2563 scope.go:117] "RemoveContainer" containerID="940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064" Apr 16 20:37:45.369814 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:37:45.369796 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064\": container with ID starting with 940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064 not found: ID does not exist" containerID="940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064" Apr 16 20:37:45.369879 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.369822 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064"} err="failed to get container status \"940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064\": rpc error: code = NotFound desc = could not 
find container \"940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064\": container with ID starting with 940b8efa424f7493fb16b8529f1332b50ca3c83aa9bdc2e3f5155056c17e0064 not found: ID does not exist" Apr 16 20:37:45.371124 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.371105 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch"] Apr 16 20:37:45.374586 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:45.374551 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7jbch"] Apr 16 20:37:46.360156 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:46.360120 2563 generic.go:358] "Generic (PLEG): container finished" podID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerID="3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882" exitCode=0 Apr 16 20:37:46.360482 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:46.360196 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" event={"ID":"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7","Type":"ContainerDied","Data":"3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882"} Apr 16 20:37:47.298610 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:47.298578 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" path="/var/lib/kubelet/pods/066b2ead-92b6-4c81-a1b9-4a336483a9e3/volumes" Apr 16 20:37:47.363988 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:47.363958 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" event={"ID":"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7","Type":"ContainerStarted","Data":"4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6"} Apr 16 20:37:47.364420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:47.364243 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:37:47.365207 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:47.365182 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:37:48.367522 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:48.367483 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:37:58.367664 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:37:58.367618 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:38:08.368300 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:38:08.368256 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.132.0.41:8080: connect: connection refused" Apr 16 20:38:18.367973 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:38:18.367933 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:38:28.368168 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:38:28.368125 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:38:38.368407 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:38:38.368317 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:38:48.368098 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:38:48.368055 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:38:57.294930 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:38:57.294887 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.41:8080: connect: connection refused" Apr 16 20:39:07.298844 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:07.298816 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:39:07.316363 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:07.316319 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podStartSLOduration=86.316306208 podStartE2EDuration="1m26.316306208s" podCreationTimestamp="2026-04-16 20:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:37:47.396996372 +0000 UTC m=+1564.685922157" watchObservedRunningTime="2026-04-16 20:39:07.316306208 +0000 UTC m=+1644.605231992" Apr 16 20:39:12.897858 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.897826 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m"] Apr 16 20:39:12.898219 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.898077 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" containerID="cri-o://4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6" gracePeriod=30 Apr 16 20:39:12.975502 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.975469 2563 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4"] Apr 16 20:39:12.975821 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.975808 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="storage-initializer" Apr 16 20:39:12.975868 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.975823 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="storage-initializer" Apr 16 20:39:12.975868 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.975831 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" Apr 16 20:39:12.975868 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.975836 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" Apr 16 20:39:12.975958 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.975894 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="066b2ead-92b6-4c81-a1b9-4a336483a9e3" containerName="kserve-container" Apr 16 20:39:12.978836 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.978818 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:39:12.985865 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:12.985841 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4"] Apr 16 20:39:13.017859 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.017838 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc96af47-db7c-479d-a08c-df1f951cb0b9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4\" (UID: \"bc96af47-db7c-479d-a08c-df1f951cb0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:39:13.119012 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.118988 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc96af47-db7c-479d-a08c-df1f951cb0b9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4\" (UID: \"bc96af47-db7c-479d-a08c-df1f951cb0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:39:13.119286 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.119271 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc96af47-db7c-479d-a08c-df1f951cb0b9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4\" (UID: \"bc96af47-db7c-479d-a08c-df1f951cb0b9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:39:13.290972 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.290944 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:39:13.411007 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.410982 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4"] Apr 16 20:39:13.412759 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:39:13.412734 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc96af47_db7c_479d_a08c_df1f951cb0b9.slice/crio-dd353cb98b2df6fa95d45d77d88c0b05c21e98dfc46900819823b7a4228bd24e WatchSource:0}: Error finding container dd353cb98b2df6fa95d45d77d88c0b05c21e98dfc46900819823b7a4228bd24e: Status 404 returned error can't find the container with id dd353cb98b2df6fa95d45d77d88c0b05c21e98dfc46900819823b7a4228bd24e Apr 16 20:39:13.414510 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.414493 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:39:13.631454 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.631374 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" event={"ID":"bc96af47-db7c-479d-a08c-df1f951cb0b9","Type":"ContainerStarted","Data":"4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00"} Apr 16 20:39:13.631454 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:13.631418 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" event={"ID":"bc96af47-db7c-479d-a08c-df1f951cb0b9","Type":"ContainerStarted","Data":"dd353cb98b2df6fa95d45d77d88c0b05c21e98dfc46900819823b7a4228bd24e"} Apr 16 20:39:16.034709 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.034685 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:39:16.139195 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.139132 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7-kserve-provision-location\") pod \"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7\" (UID: \"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7\") " Apr 16 20:39:16.139421 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.139399 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" (UID: "e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:39:16.240120 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.240096 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:39:16.643064 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.643035 2563 generic.go:358] "Generic (PLEG): container finished" podID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerID="4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6" exitCode=0 Apr 16 20:39:16.643221 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.643099 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" Apr 16 20:39:16.643221 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.643123 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" event={"ID":"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7","Type":"ContainerDied","Data":"4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6"} Apr 16 20:39:16.643221 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.643166 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m" event={"ID":"e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7","Type":"ContainerDied","Data":"54027331e3e9727438be354a28f13335f8a6eb6f36d42d149224eabddc614158"} Apr 16 20:39:16.643221 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.643183 2563 scope.go:117] "RemoveContainer" containerID="4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6" Apr 16 20:39:16.654732 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.654664 2563 scope.go:117] "RemoveContainer" containerID="3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882" Apr 16 20:39:16.661301 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.661285 2563 scope.go:117] "RemoveContainer" containerID="4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6" Apr 16 20:39:16.661512 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:39:16.661495 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6\": container with ID starting with 4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6 not found: ID does not exist" containerID="4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6" Apr 16 20:39:16.661574 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.661518 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6"} err="failed to get container status \"4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6\": rpc error: code = NotFound desc = could not find container \"4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6\": container with ID starting with 4eb387b8b8f4f44714802ac2866bd788e9456547ec02b43d9a926b8a089d0cf6 not found: ID does not exist" Apr 16 20:39:16.661574 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.661534 2563 scope.go:117] "RemoveContainer" containerID="3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882" Apr 16 20:39:16.661779 ip-10-0-138-118 
kubenswrapper[2563]: E0416 20:39:16.661763 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882\": container with ID starting with 3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882 not found: ID does not exist" containerID="3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882" Apr 16 20:39:16.661823 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.661785 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882"} err="failed to get container status \"3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882\": rpc error: code = NotFound desc = could not find container \"3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882\": container with ID starting with 3f20248406fe2f59cf546468f6649651acf089babfe314cc7c20f298cef33882 not found: ID does not exist" Apr 16 20:39:16.667713 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.667692 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m"] Apr 16 20:39:16.670708 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:16.670689 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-hwj7m"] Apr 16 20:39:17.298692 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:17.298661 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" path="/var/lib/kubelet/pods/e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7/volumes" Apr 16 20:39:17.647872 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:17.647842 2563 generic.go:358] "Generic (PLEG): container finished" podID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerID="4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00" exitCode=0 Apr 16 20:39:17.648057 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:17.647918 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" event={"ID":"bc96af47-db7c-479d-a08c-df1f951cb0b9","Type":"ContainerDied","Data":"4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00"} Apr 16 20:39:18.652621 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:18.652589 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" event={"ID":"bc96af47-db7c-479d-a08c-df1f951cb0b9","Type":"ContainerStarted","Data":"15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27"} Apr 16 20:39:18.653034 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:18.652865 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:39:18.654057 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:18.654031 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:39:18.669519 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:18.669480 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podStartSLOduration=6.669467644 podStartE2EDuration="6.669467644s" podCreationTimestamp="2026-04-16 20:39:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:39:18.667359613 +0000 UTC m=+1655.956285398" watchObservedRunningTime="2026-04-16 20:39:18.669467644 +0000 UTC m=+1655.958393428" Apr 16 20:39:19.655604 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:19.655542 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:39:29.656346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:29.656305 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:39:39.656140 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:39.656098 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:39:49.656440 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:49.656401 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:39:59.655889 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:39:59.655846 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:40:09.656700 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:09.656591 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:40:19.656081 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:19.656035 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:40:29.656314 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:29.656269 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.42:8080: connect: connection refused" Apr 16 20:40:39.656515 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:39.656478 
2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:40:44.130705 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.130674 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4"] Apr 16 20:40:44.131041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.130929 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" containerID="cri-o://15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27" gracePeriod=30 Apr 16 20:40:44.209655 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.209623 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn"] Apr 16 20:40:44.209997 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.209981 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="storage-initializer" Apr 16 20:40:44.210078 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.210000 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="storage-initializer" Apr 16 20:40:44.210078 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.210025 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" Apr 16 20:40:44.210078 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.210034 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" Apr 16 20:40:44.210233 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.210111 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="e8ce21e5-3f56-4ac7-8e2f-ed90d47729e7" containerName="kserve-container" Apr 16 20:40:44.212988 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.212970 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:40:44.222664 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.222646 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn"] Apr 16 20:40:44.287037 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.287013 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95f6b62d-3620-4017-b916-b1aed94144d1-kserve-provision-location\") pod \"isvc-primary-c25916-predictor-6c59bfbc4-mfpvn\" (UID: \"95f6b62d-3620-4017-b916-b1aed94144d1\") " pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:40:44.388102 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.388034 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95f6b62d-3620-4017-b916-b1aed94144d1-kserve-provision-location\") pod \"isvc-primary-c25916-predictor-6c59bfbc4-mfpvn\" (UID: \"95f6b62d-3620-4017-b916-b1aed94144d1\") " pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:40:44.388366 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.388347 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95f6b62d-3620-4017-b916-b1aed94144d1-kserve-provision-location\") pod \"isvc-primary-c25916-predictor-6c59bfbc4-mfpvn\" (UID: \"95f6b62d-3620-4017-b916-b1aed94144d1\") " pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:40:44.524817 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.524786 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:40:44.655153 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.655015 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn"] Apr 16 20:40:44.657726 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:40:44.657699 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95f6b62d_3620_4017_b916_b1aed94144d1.slice/crio-2895367b48357ef4e94a82b84f4b568f86bec7a60cb73d180688d827fa8cc3e0 WatchSource:0}: Error finding container 2895367b48357ef4e94a82b84f4b568f86bec7a60cb73d180688d827fa8cc3e0: Status 404 returned error can't find the container with id 2895367b48357ef4e94a82b84f4b568f86bec7a60cb73d180688d827fa8cc3e0 Apr 16 20:40:44.919038 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.919005 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" event={"ID":"95f6b62d-3620-4017-b916-b1aed94144d1","Type":"ContainerStarted","Data":"655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b"} Apr 16 20:40:44.919260 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:44.919044 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" event={"ID":"95f6b62d-3620-4017-b916-b1aed94144d1","Type":"ContainerStarted","Data":"2895367b48357ef4e94a82b84f4b568f86bec7a60cb73d180688d827fa8cc3e0"} Apr 16 20:40:47.173405 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.173379 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:40:47.209209 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.209180 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc96af47-db7c-479d-a08c-df1f951cb0b9-kserve-provision-location\") pod \"bc96af47-db7c-479d-a08c-df1f951cb0b9\" (UID: \"bc96af47-db7c-479d-a08c-df1f951cb0b9\") " Apr 16 20:40:47.209472 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.209451 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc96af47-db7c-479d-a08c-df1f951cb0b9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc96af47-db7c-479d-a08c-df1f951cb0b9" (UID: "bc96af47-db7c-479d-a08c-df1f951cb0b9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:40:47.310058 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.310037 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc96af47-db7c-479d-a08c-df1f951cb0b9-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:40:47.929356 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.929324 2563 generic.go:358] "Generic (PLEG): container finished" podID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerID="15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27" exitCode=0 Apr 16 20:40:47.929518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.929382 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" event={"ID":"bc96af47-db7c-479d-a08c-df1f951cb0b9","Type":"ContainerDied","Data":"15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27"} Apr 16 20:40:47.929518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.929409 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" event={"ID":"bc96af47-db7c-479d-a08c-df1f951cb0b9","Type":"ContainerDied","Data":"dd353cb98b2df6fa95d45d77d88c0b05c21e98dfc46900819823b7a4228bd24e"} Apr 16 20:40:47.929518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.929424 2563 scope.go:117] "RemoveContainer" containerID="15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27" Apr 16 20:40:47.929518 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.929385 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4" Apr 16 20:40:47.937409 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.937389 2563 scope.go:117] "RemoveContainer" containerID="4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00" Apr 16 20:40:47.944114 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.944097 2563 scope.go:117] "RemoveContainer" containerID="15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27" Apr 16 20:40:47.944372 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:40:47.944352 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27\": container with ID starting with 15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27 not found: ID does not exist" containerID="15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27" Apr 16 20:40:47.944462 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.944384 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27"} err="failed to get container status \"15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27\": rpc error: code = NotFound desc = could not find container \"15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27\": container with ID starting with 15a6797928b219b19d7d6fb5a0d38402e88af325030b7d8468eae4610b75ac27 not found: ID does not exist" Apr 16 20:40:47.944462 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.944414 2563 scope.go:117] "RemoveContainer" containerID="4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00" Apr 16 20:40:47.945759 ip-10-0-138-118 
kubenswrapper[2563]: E0416 20:40:47.945741 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00\": container with ID starting with 4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00 not found: ID does not exist" containerID="4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00" Apr 16 20:40:47.945867 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.945767 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00"} err="failed to get container status \"4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00\": rpc error: code = NotFound desc = could not find container \"4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00\": container with ID starting with 4e4256be96e103d580b6303bc8407b27285a825f12867d004553175745b59a00 not found: ID does not exist" Apr 16 20:40:47.946231 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.946214 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4"] Apr 16 20:40:47.949346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:47.949327 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-w5pk4"] Apr 16 20:40:48.934869 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:48.934833 2563 generic.go:358] "Generic (PLEG): container finished" podID="95f6b62d-3620-4017-b916-b1aed94144d1" containerID="655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b" exitCode=0 Apr 16 20:40:48.934869 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:48.934870 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" event={"ID":"95f6b62d-3620-4017-b916-b1aed94144d1","Type":"ContainerDied","Data":"655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b"} Apr 16 20:40:49.299420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:49.299349 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" path="/var/lib/kubelet/pods/bc96af47-db7c-479d-a08c-df1f951cb0b9/volumes" Apr 16 20:40:49.940082 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:49.940011 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" event={"ID":"95f6b62d-3620-4017-b916-b1aed94144d1","Type":"ContainerStarted","Data":"aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953"} Apr 16 20:40:49.940410 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:49.940365 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:40:49.941656 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:49.941631 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:40:49.956733 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:49.956688 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podStartSLOduration=5.956677184 podStartE2EDuration="5.956677184s" podCreationTimestamp="2026-04-16 20:40:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:40:49.954967995 +0000 UTC m=+1747.243893779" watchObservedRunningTime="2026-04-16 20:40:49.956677184 +0000 UTC m=+1747.245602968" Apr 16 20:40:50.948804 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:40:50.948763 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:41:00.949572 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:00.949526 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:41:10.948743 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:10.948703 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:41:20.948684 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:20.948646 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:41:30.948718 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:30.948673 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:41:40.949641 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:40.949599 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused" Apr 16 20:41:50.950260 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:50.950227 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:41:54.313408 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.313368 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8"] Apr 16 20:41:54.313896 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.313842 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" Apr 16 20:41:54.313896 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.313860 2563 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" Apr 16 20:41:54.313896 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.313881 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="storage-initializer" Apr 16 20:41:54.313896 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.313888 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="storage-initializer" Apr 16 20:41:54.314106 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.313966 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc96af47-db7c-479d-a08c-df1f951cb0b9" containerName="kserve-container" Apr 16 20:41:54.318033 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.318014 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.320638 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.320616 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-c25916\"" Apr 16 20:41:54.320751 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.320651 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 20:41:54.322109 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.322090 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-c25916-dockercfg-lhf4n\"" Apr 16 20:41:54.325189 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.325166 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8"] Apr 16 20:41:54.497323 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.497296 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-kserve-provision-location\") pod \"isvc-secondary-c25916-predictor-66b7549787-jfhg8\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.497468 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.497374 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-cabundle-cert\") pod \"isvc-secondary-c25916-predictor-66b7549787-jfhg8\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.597684 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.597621 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-kserve-provision-location\") pod \"isvc-secondary-c25916-predictor-66b7549787-jfhg8\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.597784 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.597689 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: 
\"kubernetes.io/configmap/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-cabundle-cert\") pod \"isvc-secondary-c25916-predictor-66b7549787-jfhg8\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.597952 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.597933 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-kserve-provision-location\") pod \"isvc-secondary-c25916-predictor-66b7549787-jfhg8\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.598165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.598148 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-cabundle-cert\") pod \"isvc-secondary-c25916-predictor-66b7549787-jfhg8\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.630907 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.630887 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:41:54.749346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:54.749317 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8"] Apr 16 20:41:54.752042 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:41:54.752006 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f6b4a88_2a9d_40ec_ac0a_c9a8873bcd9d.slice/crio-2fa14f5d8fd69f6c7bea0604d59eb61bd52c29204f00507096d8b5ec19b371df WatchSource:0}: Error finding container 2fa14f5d8fd69f6c7bea0604d59eb61bd52c29204f00507096d8b5ec19b371df: Status 404 returned error can't find the container with id 2fa14f5d8fd69f6c7bea0604d59eb61bd52c29204f00507096d8b5ec19b371df Apr 16 20:41:55.150155 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:55.150116 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" event={"ID":"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d","Type":"ContainerStarted","Data":"4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db"} Apr 16 20:41:55.150155 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:41:55.150161 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" event={"ID":"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d","Type":"ContainerStarted","Data":"2fa14f5d8fd69f6c7bea0604d59eb61bd52c29204f00507096d8b5ec19b371df"} Apr 16 20:42:00.168060 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:00.168034 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c25916-predictor-66b7549787-jfhg8_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/storage-initializer/0.log" Apr 16 20:42:00.168403 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:00.168070 2563 generic.go:358] "Generic (PLEG): container finished" podID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerID="4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db" exitCode=1 Apr 16 20:42:00.168403 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:00.168153 2563 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" event={"ID":"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d","Type":"ContainerDied","Data":"4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db"} Apr 16 20:42:01.173502 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:01.173478 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c25916-predictor-66b7549787-jfhg8_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/storage-initializer/0.log" Apr 16 20:42:01.173883 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:01.173584 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" event={"ID":"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d","Type":"ContainerStarted","Data":"c2b37ec56a73d7b458fe8405fd84cd3cd307c5f37dbca460d4954d4cc130e99a"} Apr 16 20:42:04.183928 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:04.183905 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c25916-predictor-66b7549787-jfhg8_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/storage-initializer/1.log" Apr 16 20:42:04.184304 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:04.184287 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c25916-predictor-66b7549787-jfhg8_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/storage-initializer/0.log" Apr 16 20:42:04.184390 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:04.184330 2563 generic.go:358] "Generic (PLEG): container finished" podID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerID="c2b37ec56a73d7b458fe8405fd84cd3cd307c5f37dbca460d4954d4cc130e99a" exitCode=1 Apr 16 20:42:04.184390 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:04.184382 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" event={"ID":"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d","Type":"ContainerDied","Data":"c2b37ec56a73d7b458fe8405fd84cd3cd307c5f37dbca460d4954d4cc130e99a"} Apr 16 20:42:04.184501 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:04.184423 2563 scope.go:117] "RemoveContainer" containerID="4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db" Apr 16 20:42:04.184739 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:04.184715 2563 scope.go:117] "RemoveContainer" containerID="4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db" Apr 16 20:42:04.194085 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:42:04.194055 2563 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-c25916-predictor-66b7549787-jfhg8_kserve-ci-e2e-test_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d_0 in pod sandbox 2fa14f5d8fd69f6c7bea0604d59eb61bd52c29204f00507096d8b5ec19b371df from index: no such id: '4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db'" containerID="4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db" Apr 16 20:42:04.194150 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:42:04.194110 2563 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-c25916-predictor-66b7549787-jfhg8_kserve-ci-e2e-test_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d_0 in pod sandbox 
2fa14f5d8fd69f6c7bea0604d59eb61bd52c29204f00507096d8b5ec19b371df from index: no such id: '4a09b86ecb5efc927c6f40a88337fd33085e812f2604c52dfed4837a23e103db'; Skipping pod \"isvc-secondary-c25916-predictor-66b7549787-jfhg8_kserve-ci-e2e-test(6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d)\"" logger="UnhandledError" Apr 16 20:42:04.195420 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:42:04.195401 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-c25916-predictor-66b7549787-jfhg8_kserve-ci-e2e-test(6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d)\"" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" Apr 16 20:42:05.189702 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:05.189673 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c25916-predictor-66b7549787-jfhg8_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/storage-initializer/1.log" Apr 16 20:42:12.379351 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.379320 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn"] Apr 16 20:42:12.379837 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.379585 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" containerID="cri-o://aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953" gracePeriod=30 Apr 16 20:42:12.443233 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.443204 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8"] Apr 16 20:42:12.514108 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.514006 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55"] Apr 16 20:42:12.518504 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.518485 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.521478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.521458 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-72f2aa\"" Apr 16 20:42:12.521478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.521459 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3a892e0-270e-4930-bf93-7c877b2bef54-cabundle-cert\") pod \"isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.521760 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.521548 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a892e0-270e-4930-bf93-7c877b2bef54-kserve-provision-location\") pod \"isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.521760 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.521645 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-72f2aa-dockercfg-ghsj7\"" Apr 16 20:42:12.526680 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.526572 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55"] Apr 16 20:42:12.565255 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.565238 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c25916-predictor-66b7549787-jfhg8_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/storage-initializer/1.log" Apr 16 20:42:12.565338 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.565294 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:42:12.622262 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.622238 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3a892e0-270e-4930-bf93-7c877b2bef54-cabundle-cert\") pod \"isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.622392 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.622282 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a892e0-270e-4930-bf93-7c877b2bef54-kserve-provision-location\") pod \"isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.622643 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.622628 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a892e0-270e-4930-bf93-7c877b2bef54-kserve-provision-location\") pod \"isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.622858 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.622840 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3a892e0-270e-4930-bf93-7c877b2bef54-cabundle-cert\") pod \"isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.723446 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.723422 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-cabundle-cert\") pod \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " Apr 16 20:42:12.723543 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.723466 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-kserve-provision-location\") pod \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\" (UID: \"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d\") " Apr 16 20:42:12.723753 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.723734 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" (UID: "6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:42:12.723792 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.723750 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" (UID: "6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:42:12.824865 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.824837 2563 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-cabundle-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:42:12.824865 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.824862 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:42:12.830786 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.830762 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:12.947275 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:12.947238 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55"] Apr 16 20:42:12.951110 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:42:12.951081 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3a892e0_270e_4930_bf93_7c877b2bef54.slice/crio-45f826dab746a16c72895929a099c3002f33723513494fd3981edbf4e04d00ef WatchSource:0}: Error finding container 45f826dab746a16c72895929a099c3002f33723513494fd3981edbf4e04d00ef: Status 404 returned error can't find the container with id 45f826dab746a16c72895929a099c3002f33723513494fd3981edbf4e04d00ef Apr 16 20:42:13.217434 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.217399 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" event={"ID":"a3a892e0-270e-4930-bf93-7c877b2bef54","Type":"ContainerStarted","Data":"9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098"} Apr 16 20:42:13.217434 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.217438 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" event={"ID":"a3a892e0-270e-4930-bf93-7c877b2bef54","Type":"ContainerStarted","Data":"45f826dab746a16c72895929a099c3002f33723513494fd3981edbf4e04d00ef"} Apr 16 20:42:13.218668 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.218645 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-c25916-predictor-66b7549787-jfhg8_6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/storage-initializer/1.log" Apr 16 20:42:13.218783 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.218743 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" Apr 16 20:42:13.218878 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.218742 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8" event={"ID":"6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d","Type":"ContainerDied","Data":"2fa14f5d8fd69f6c7bea0604d59eb61bd52c29204f00507096d8b5ec19b371df"} Apr 16 20:42:13.218878 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.218863 2563 scope.go:117] "RemoveContainer" containerID="c2b37ec56a73d7b458fe8405fd84cd3cd307c5f37dbca460d4954d4cc130e99a" Apr 16 20:42:13.260528 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.260467 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8"] Apr 16 20:42:13.266344 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.266319 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-c25916-predictor-66b7549787-jfhg8"] Apr 16 20:42:13.299117 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:13.299093 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" path="/var/lib/kubelet/pods/6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d/volumes" Apr 16 20:42:16.113844 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.113822 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:42:16.145286 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.145264 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95f6b62d-3620-4017-b916-b1aed94144d1-kserve-provision-location\") pod \"95f6b62d-3620-4017-b916-b1aed94144d1\" (UID: \"95f6b62d-3620-4017-b916-b1aed94144d1\") " Apr 16 20:42:16.145569 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.145542 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95f6b62d-3620-4017-b916-b1aed94144d1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "95f6b62d-3620-4017-b916-b1aed94144d1" (UID: "95f6b62d-3620-4017-b916-b1aed94144d1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:42:16.230502 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.230444 2563 generic.go:358] "Generic (PLEG): container finished" podID="95f6b62d-3620-4017-b916-b1aed94144d1" containerID="aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953" exitCode=0 Apr 16 20:42:16.230630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.230509 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" Apr 16 20:42:16.230630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.230531 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" event={"ID":"95f6b62d-3620-4017-b916-b1aed94144d1","Type":"ContainerDied","Data":"aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953"} Apr 16 20:42:16.230630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.230592 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn" event={"ID":"95f6b62d-3620-4017-b916-b1aed94144d1","Type":"ContainerDied","Data":"2895367b48357ef4e94a82b84f4b568f86bec7a60cb73d180688d827fa8cc3e0"} Apr 16 20:42:16.230630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.230615 2563 scope.go:117] "RemoveContainer" containerID="aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953" Apr 16 20:42:16.238017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.237996 2563 scope.go:117] "RemoveContainer" containerID="655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b" Apr 16 20:42:16.244689 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.244675 2563 scope.go:117] "RemoveContainer" containerID="aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953" Apr 16 20:42:16.244921 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:42:16.244903 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953\": container with ID starting with aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953 not found: ID does not exist" containerID="aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953" Apr 16 20:42:16.244982 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.244931 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953"} err="failed to get container status \"aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953\": rpc error: code = NotFound desc = could not find container \"aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953\": container with ID starting with aa125d71f88e7e95a4fc8d476cb1e4266fc7e75ef7f004cf5baa5595fd31a953 not found: ID does not exist" Apr 16 20:42:16.244982 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.244947 2563 scope.go:117] "RemoveContainer" containerID="655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b" Apr 16 20:42:16.245171 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:42:16.245153 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b\": container with ID starting with 655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b not found: ID does not exist" containerID="655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b" Apr 16 20:42:16.245222 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.245176 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b"} err="failed to get container status \"655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b\": rpc error: code 
= NotFound desc = could not find container \"655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b\": container with ID starting with 655646b8eec20a9205fbc7b5817bf1a2a2abdbc91f693b49188cedf5e198742b not found: ID does not exist" Apr 16 20:42:16.246526 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.246513 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/95f6b62d-3620-4017-b916-b1aed94144d1-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:42:16.264144 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.259742 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn"] Apr 16 20:42:16.266473 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:16.266441 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-c25916-predictor-6c59bfbc4-mfpvn"] Apr 16 20:42:17.298678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:17.298645 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" path="/var/lib/kubelet/pods/95f6b62d-3620-4017-b916-b1aed94144d1/volumes" Apr 16 20:42:18.240097 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:18.240037 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55_a3a892e0-270e-4930-bf93-7c877b2bef54/storage-initializer/0.log" Apr 16 20:42:18.240097 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:18.240072 2563 generic.go:358] "Generic (PLEG): container finished" podID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerID="9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098" exitCode=1 Apr 16 20:42:18.240247 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:18.240126 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" event={"ID":"a3a892e0-270e-4930-bf93-7c877b2bef54","Type":"ContainerDied","Data":"9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098"} Apr 16 20:42:19.245097 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:19.245068 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55_a3a892e0-270e-4930-bf93-7c877b2bef54/storage-initializer/0.log" Apr 16 20:42:19.245546 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:19.245123 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" event={"ID":"a3a892e0-270e-4930-bf93-7c877b2bef54","Type":"ContainerStarted","Data":"b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03"} Apr 16 20:42:22.541866 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.541835 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55"] Apr 16 20:42:22.542239 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.542145 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerName="storage-initializer" containerID="cri-o://b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03" gracePeriod=30 Apr 16 20:42:22.698613 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698578 2563 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz"] Apr 16 20:42:22.698932 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698919 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerName="storage-initializer" Apr 16 20:42:22.698975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698935 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerName="storage-initializer" Apr 16 20:42:22.698975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698952 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerName="storage-initializer" Apr 16 20:42:22.698975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698960 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerName="storage-initializer" Apr 16 20:42:22.698975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698971 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" Apr 16 20:42:22.699096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698977 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" Apr 16 20:42:22.699096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698987 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="storage-initializer" Apr 16 20:42:22.699096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.698993 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="storage-initializer" Apr 16 20:42:22.699096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.699049 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerName="storage-initializer" Apr 16 20:42:22.699096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.699059 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f6b4a88-2a9d-40ec-ac0a-c9a8873bcd9d" containerName="storage-initializer" Apr 16 20:42:22.699096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.699066 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="95f6b62d-3620-4017-b916-b1aed94144d1" containerName="kserve-container" Apr 16 20:42:22.702014 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.701997 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:42:22.704716 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.704695 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-757nb\"" Apr 16 20:42:22.709936 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.709030 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz"] Apr 16 20:42:22.774910 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.774890 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55_a3a892e0-270e-4930-bf93-7c877b2bef54/storage-initializer/1.log" Apr 16 20:42:22.775232 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.775217 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55_a3a892e0-270e-4930-bf93-7c877b2bef54/storage-initializer/0.log" Apr 16 20:42:22.775299 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.775276 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:22.795476 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.795415 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a892e0-270e-4930-bf93-7c877b2bef54-kserve-provision-location\") pod \"a3a892e0-270e-4930-bf93-7c877b2bef54\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " Apr 16 20:42:22.795588 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.795488 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3a892e0-270e-4930-bf93-7c877b2bef54-cabundle-cert\") pod \"a3a892e0-270e-4930-bf93-7c877b2bef54\" (UID: \"a3a892e0-270e-4930-bf93-7c877b2bef54\") " Apr 16 20:42:22.795707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.795685 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3a892e0-270e-4930-bf93-7c877b2bef54-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a3a892e0-270e-4930-bf93-7c877b2bef54" (UID: "a3a892e0-270e-4930-bf93-7c877b2bef54"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:42:22.795776 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.795702 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a785044a-3b36-464a-a37e-146a0a2f44fa-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-2phnz\" (UID: \"a785044a-3b36-464a-a37e-146a0a2f44fa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:42:22.795776 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.795750 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a3a892e0-270e-4930-bf93-7c877b2bef54-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:42:22.795883 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.795832 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3a892e0-270e-4930-bf93-7c877b2bef54-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a3a892e0-270e-4930-bf93-7c877b2bef54" (UID: "a3a892e0-270e-4930-bf93-7c877b2bef54"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 20:42:22.896806 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.896781 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a785044a-3b36-464a-a37e-146a0a2f44fa-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-2phnz\" (UID: \"a785044a-3b36-464a-a37e-146a0a2f44fa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:42:22.896922 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.896886 2563 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a3a892e0-270e-4930-bf93-7c877b2bef54-cabundle-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:42:22.897123 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:22.897107 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a785044a-3b36-464a-a37e-146a0a2f44fa-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-2phnz\" (UID: \"a785044a-3b36-464a-a37e-146a0a2f44fa\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:42:23.015107 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.015085 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:42:23.130097 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.130072 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz"] Apr 16 20:42:23.132181 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:42:23.132147 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda785044a_3b36_464a_a37e_146a0a2f44fa.slice/crio-8511627d925010e0db050f9a75bd09e2a171ce14ddf90da8cbab94fd6edba3ad WatchSource:0}: Error finding container 8511627d925010e0db050f9a75bd09e2a171ce14ddf90da8cbab94fd6edba3ad: Status 404 returned error can't find the container with id 8511627d925010e0db050f9a75bd09e2a171ce14ddf90da8cbab94fd6edba3ad Apr 16 20:42:23.259076 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.259055 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55_a3a892e0-270e-4930-bf93-7c877b2bef54/storage-initializer/1.log" Apr 16 20:42:23.259415 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.259400 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55_a3a892e0-270e-4930-bf93-7c877b2bef54/storage-initializer/0.log" Apr 16 20:42:23.259474 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.259435 2563 generic.go:358] "Generic (PLEG): container finished" podID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerID="b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03" exitCode=1 Apr 16 20:42:23.259513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.259500 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" Apr 16 20:42:23.259586 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.259527 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" event={"ID":"a3a892e0-270e-4930-bf93-7c877b2bef54","Type":"ContainerDied","Data":"b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03"} Apr 16 20:42:23.259647 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.259592 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55" event={"ID":"a3a892e0-270e-4930-bf93-7c877b2bef54","Type":"ContainerDied","Data":"45f826dab746a16c72895929a099c3002f33723513494fd3981edbf4e04d00ef"} Apr 16 20:42:23.259647 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.259615 2563 scope.go:117] "RemoveContainer" containerID="b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03" Apr 16 20:42:23.261204 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.261165 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" event={"ID":"a785044a-3b36-464a-a37e-146a0a2f44fa","Type":"ContainerStarted","Data":"02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5"} Apr 16 20:42:23.261204 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.261200 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" event={"ID":"a785044a-3b36-464a-a37e-146a0a2f44fa","Type":"ContainerStarted","Data":"8511627d925010e0db050f9a75bd09e2a171ce14ddf90da8cbab94fd6edba3ad"} Apr 16 20:42:23.267729 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.267645 2563 scope.go:117] "RemoveContainer" containerID="9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098" Apr 16 20:42:23.274098 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.274079 2563 scope.go:117] "RemoveContainer" containerID="b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03" Apr 16 20:42:23.274335 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:42:23.274316 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03\": container with ID starting with b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03 not found: ID does not exist" containerID="b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03" Apr 16 20:42:23.274403 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.274347 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03"} err="failed to get container status \"b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03\": rpc error: code = NotFound desc = could not find container \"b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03\": container with ID starting with b3db82ce838bb1d19350bced10ea51ac6c91f93c00a3a77c915d2f7345cf1d03 not found: ID does not exist" Apr 16 20:42:23.274403 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.274371 2563 scope.go:117] "RemoveContainer" containerID="9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098" Apr 16 20:42:23.274608 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:42:23.274586 2563 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098\": container with ID starting with 9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098 not found: ID does not exist" containerID="9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098" Apr 16 20:42:23.274677 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.274616 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098"} err="failed to get container status \"9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098\": rpc error: code = NotFound desc = could not find container \"9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098\": container with ID starting with 9d36b49595a1ec6b48b48d168a81b7c7fa8296a53e3755b178187ea4b01ff098 not found: ID does not exist" Apr 16 20:42:23.305373 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.305350 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55"] Apr 16 20:42:23.309002 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:23.308940 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-72f2aa-predictor-d467fb9d7-qlt55"] Apr 16 20:42:25.298605 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:25.298554 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" path="/var/lib/kubelet/pods/a3a892e0-270e-4930-bf93-7c877b2bef54/volumes" Apr 16 20:42:27.275847 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:27.275816 2563 generic.go:358] "Generic (PLEG): container finished" podID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerID="02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5" exitCode=0 Apr 16 20:42:27.276185 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:27.275891 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" event={"ID":"a785044a-3b36-464a-a37e-146a0a2f44fa","Type":"ContainerDied","Data":"02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5"} Apr 16 20:42:48.347976 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:48.347941 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" event={"ID":"a785044a-3b36-464a-a37e-146a0a2f44fa","Type":"ContainerStarted","Data":"8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895"} Apr 16 20:42:48.348448 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:48.348233 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:42:48.349347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:48.349321 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:42:48.365802 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:48.365397 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podStartSLOduration=6.217303764 
podStartE2EDuration="26.365379718s" podCreationTimestamp="2026-04-16 20:42:22 +0000 UTC" firstStartedPulling="2026-04-16 20:42:27.277094245 +0000 UTC m=+1844.566020010" lastFinishedPulling="2026-04-16 20:42:47.425170193 +0000 UTC m=+1864.714095964" observedRunningTime="2026-04-16 20:42:48.363449644 +0000 UTC m=+1865.652375420" watchObservedRunningTime="2026-04-16 20:42:48.365379718 +0000 UTC m=+1865.654305504" Apr 16 20:42:49.351441 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:49.351400 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:42:59.351808 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:42:59.351767 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:43:09.351735 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:43:09.351646 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:43:19.352379 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:43:19.352333 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:43:29.352023 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:43:29.351983 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:43:39.351813 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:43:39.351771 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:43:49.351757 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:43:49.351714 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 20:43:57.300284 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:43:57.300246 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:44:02.838626 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.838595 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz"] Apr 16 20:44:02.839090 
ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.838859 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" containerID="cri-o://8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895" gracePeriod=30 Apr 16 20:44:02.935247 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.935213 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht"] Apr 16 20:44:02.935571 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.935542 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerName="storage-initializer" Apr 16 20:44:02.935571 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.935569 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerName="storage-initializer" Apr 16 20:44:02.935692 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.935580 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerName="storage-initializer" Apr 16 20:44:02.935692 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.935586 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerName="storage-initializer" Apr 16 20:44:02.935692 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.935662 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerName="storage-initializer" Apr 16 20:44:02.935692 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.935673 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a3a892e0-270e-4930-bf93-7c877b2bef54" containerName="storage-initializer" Apr 16 20:44:02.938591 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.938576 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:44:02.946267 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:02.946209 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht"] Apr 16 20:44:03.059263 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:03.059235 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7211ff84-7df1-4ec6-9e05-0b4309da5e62-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht\" (UID: \"7211ff84-7df1-4ec6-9e05-0b4309da5e62\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:44:03.159825 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:03.159742 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7211ff84-7df1-4ec6-9e05-0b4309da5e62-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht\" (UID: \"7211ff84-7df1-4ec6-9e05-0b4309da5e62\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:44:03.160096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:03.160077 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7211ff84-7df1-4ec6-9e05-0b4309da5e62-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht\" (UID: \"7211ff84-7df1-4ec6-9e05-0b4309da5e62\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:44:03.250368 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:03.250333 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:44:03.365333 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:03.365300 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht"] Apr 16 20:44:03.368636 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:44:03.368609 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7211ff84_7df1_4ec6_9e05_0b4309da5e62.slice/crio-96b6dd4eceff4a7e2ce919824991665575ba319ea717d0cf4d8a6963acf893c7 WatchSource:0}: Error finding container 96b6dd4eceff4a7e2ce919824991665575ba319ea717d0cf4d8a6963acf893c7: Status 404 returned error can't find the container with id 96b6dd4eceff4a7e2ce919824991665575ba319ea717d0cf4d8a6963acf893c7 Apr 16 20:44:03.566894 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:03.566853 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" event={"ID":"7211ff84-7df1-4ec6-9e05-0b4309da5e62","Type":"ContainerStarted","Data":"495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63"} Apr 16 20:44:03.567072 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:03.566901 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" event={"ID":"7211ff84-7df1-4ec6-9e05-0b4309da5e62","Type":"ContainerStarted","Data":"96b6dd4eceff4a7e2ce919824991665575ba319ea717d0cf4d8a6963acf893c7"} Apr 16 20:44:06.875812 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:06.875789 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:44:06.986121 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:06.986061 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a785044a-3b36-464a-a37e-146a0a2f44fa-kserve-provision-location\") pod \"a785044a-3b36-464a-a37e-146a0a2f44fa\" (UID: \"a785044a-3b36-464a-a37e-146a0a2f44fa\") " Apr 16 20:44:06.986337 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:06.986315 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a785044a-3b36-464a-a37e-146a0a2f44fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a785044a-3b36-464a-a37e-146a0a2f44fa" (UID: "a785044a-3b36-464a-a37e-146a0a2f44fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:44:07.086706 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.086684 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a785044a-3b36-464a-a37e-146a0a2f44fa-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:44:07.580776 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.580702 2563 generic.go:358] "Generic (PLEG): container finished" podID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerID="8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895" exitCode=0 Apr 16 20:44:07.580776 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.580768 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" Apr 16 20:44:07.580960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.580789 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" event={"ID":"a785044a-3b36-464a-a37e-146a0a2f44fa","Type":"ContainerDied","Data":"8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895"} Apr 16 20:44:07.580960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.580827 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz" event={"ID":"a785044a-3b36-464a-a37e-146a0a2f44fa","Type":"ContainerDied","Data":"8511627d925010e0db050f9a75bd09e2a171ce14ddf90da8cbab94fd6edba3ad"} Apr 16 20:44:07.580960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.580844 2563 scope.go:117] "RemoveContainer" containerID="8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895" Apr 16 20:44:07.582130 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.582097 2563 generic.go:358] "Generic (PLEG): container finished" podID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerID="495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63" exitCode=0 Apr 16 20:44:07.582207 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.582152 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" event={"ID":"7211ff84-7df1-4ec6-9e05-0b4309da5e62","Type":"ContainerDied","Data":"495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63"} Apr 16 20:44:07.589196 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.589184 2563 scope.go:117] "RemoveContainer" containerID="02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5" Apr 16 20:44:07.595995 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.595978 2563 scope.go:117] "RemoveContainer" containerID="8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895" Apr 16 20:44:07.596286 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:44:07.596265 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895\": container with ID starting with 8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895 not found: ID does not exist" containerID="8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895" Apr 16 20:44:07.596365 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.596291 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895"} err="failed to get container status \"8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895\": rpc error: code = NotFound desc = could not find container \"8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895\": container with ID starting with 8d001f5c0dffd9d33b27f20e8c27579ef4f9c1343dd6fc53d0e57bf1c65f3895 not found: ID does not exist" Apr 16 20:44:07.596365 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.596310 2563 scope.go:117] "RemoveContainer" containerID="02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5" Apr 16 20:44:07.596553 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:44:07.596536 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5\": container with ID starting with 02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5 not found: ID does not exist" containerID="02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5" Apr 16 20:44:07.596617 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.596573 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5"} err="failed to get container status \"02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5\": rpc error: code = NotFound desc = could not find container \"02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5\": container with ID starting with 02fc903d9cf323b57f6bc6fdb6971cc6a09bc065bd51aec6c353334cd79963a5 not found: ID does not exist" Apr 16 20:44:07.611252 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.611224 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz"] Apr 16 20:44:07.615022 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:07.615001 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-2phnz"] Apr 16 20:44:08.587399 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:08.587370 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" event={"ID":"7211ff84-7df1-4ec6-9e05-0b4309da5e62","Type":"ContainerStarted","Data":"3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606"} Apr 16 20:44:08.587796 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:08.587660 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:44:08.588900 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:08.588876 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:44:08.604736 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:08.604692 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podStartSLOduration=6.604681824 podStartE2EDuration="6.604681824s" podCreationTimestamp="2026-04-16 20:44:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:44:08.603161868 +0000 UTC m=+1945.892087688" watchObservedRunningTime="2026-04-16 20:44:08.604681824 +0000 UTC m=+1945.893607610" Apr 16 20:44:09.298717 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:09.298682 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" path="/var/lib/kubelet/pods/a785044a-3b36-464a-a37e-146a0a2f44fa/volumes" Apr 16 20:44:09.591096 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:09.591006 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: 
connect: connection refused" Apr 16 20:44:19.591048 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:19.591013 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:44:29.591816 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:29.591776 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:44:39.592007 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:39.591924 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:44:49.591788 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:49.591743 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:44:59.591238 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:44:59.591190 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:45:09.591656 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:09.591617 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:45:11.298983 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:11.298948 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 20:45:21.300977 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:21.300947 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:45:23.031426 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.031387 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht"] Apr 16 20:45:23.031876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.031655 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" containerID="cri-o://3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606" gracePeriod=30 Apr 
16 20:45:23.078803 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.078776 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk"] Apr 16 20:45:23.079091 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.079080 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" Apr 16 20:45:23.079135 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.079093 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" Apr 16 20:45:23.079135 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.079103 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="storage-initializer" Apr 16 20:45:23.079135 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.079109 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="storage-initializer" Apr 16 20:45:23.079227 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.079159 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a785044a-3b36-464a-a37e-146a0a2f44fa" containerName="kserve-container" Apr 16 20:45:23.082066 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.082049 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:45:23.089094 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.089073 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk"] Apr 16 20:45:23.137440 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.137415 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4861a254-3c43-4b14-b080-7c4a5560914f-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-x9vtk\" (UID: \"4861a254-3c43-4b14-b080-7c4a5560914f\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:45:23.237782 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.237754 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4861a254-3c43-4b14-b080-7c4a5560914f-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-x9vtk\" (UID: \"4861a254-3c43-4b14-b080-7c4a5560914f\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:45:23.238070 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.238052 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4861a254-3c43-4b14-b080-7c4a5560914f-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-x9vtk\" (UID: \"4861a254-3c43-4b14-b080-7c4a5560914f\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:45:23.394527 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.394457 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:45:23.509222 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.509199 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk"] Apr 16 20:45:23.511219 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:45:23.511195 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4861a254_3c43_4b14_b080_7c4a5560914f.slice/crio-a48fbd9fee74f76f17b407de11e2ea4d9bda3f707b475b64cd3d42bfbf16cbcd WatchSource:0}: Error finding container a48fbd9fee74f76f17b407de11e2ea4d9bda3f707b475b64cd3d42bfbf16cbcd: Status 404 returned error can't find the container with id a48fbd9fee74f76f17b407de11e2ea4d9bda3f707b475b64cd3d42bfbf16cbcd Apr 16 20:45:23.513031 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.513016 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:45:23.815938 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.815905 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" event={"ID":"4861a254-3c43-4b14-b080-7c4a5560914f","Type":"ContainerStarted","Data":"ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5"} Apr 16 20:45:23.815938 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:23.815942 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" event={"ID":"4861a254-3c43-4b14-b080-7c4a5560914f","Type":"ContainerStarted","Data":"a48fbd9fee74f76f17b407de11e2ea4d9bda3f707b475b64cd3d42bfbf16cbcd"} Apr 16 20:45:27.077929 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.077906 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:45:27.169138 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.169114 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7211ff84-7df1-4ec6-9e05-0b4309da5e62-kserve-provision-location\") pod \"7211ff84-7df1-4ec6-9e05-0b4309da5e62\" (UID: \"7211ff84-7df1-4ec6-9e05-0b4309da5e62\") " Apr 16 20:45:27.170252 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.170228 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7211ff84-7df1-4ec6-9e05-0b4309da5e62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7211ff84-7df1-4ec6-9e05-0b4309da5e62" (UID: "7211ff84-7df1-4ec6-9e05-0b4309da5e62"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:45:27.270104 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.270046 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7211ff84-7df1-4ec6-9e05-0b4309da5e62-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:45:27.831258 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.831226 2563 generic.go:358] "Generic (PLEG): container finished" podID="4861a254-3c43-4b14-b080-7c4a5560914f" containerID="ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5" exitCode=0 Apr 16 20:45:27.831440 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.831299 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" event={"ID":"4861a254-3c43-4b14-b080-7c4a5560914f","Type":"ContainerDied","Data":"ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5"} Apr 16 20:45:27.835769 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.835747 2563 generic.go:358] "Generic (PLEG): container finished" podID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerID="3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606" exitCode=0 Apr 16 20:45:27.835877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.835806 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" event={"ID":"7211ff84-7df1-4ec6-9e05-0b4309da5e62","Type":"ContainerDied","Data":"3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606"} Apr 16 20:45:27.835877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.835817 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" Apr 16 20:45:27.835877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.835841 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht" event={"ID":"7211ff84-7df1-4ec6-9e05-0b4309da5e62","Type":"ContainerDied","Data":"96b6dd4eceff4a7e2ce919824991665575ba319ea717d0cf4d8a6963acf893c7"} Apr 16 20:45:27.835877 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.835876 2563 scope.go:117] "RemoveContainer" containerID="3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606" Apr 16 20:45:27.846475 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.846452 2563 scope.go:117] "RemoveContainer" containerID="495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63" Apr 16 20:45:27.860299 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.860272 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht"] Apr 16 20:45:27.861056 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.861030 2563 scope.go:117] "RemoveContainer" containerID="3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606" Apr 16 20:45:27.861330 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:45:27.861306 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606\": container with ID starting with 3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606 not found: ID does not exist" containerID="3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606" Apr 16 20:45:27.861418 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.861342 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606"} err="failed to get container status \"3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606\": rpc error: code = NotFound desc = could not find container \"3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606\": container with ID starting with 3d690f3a2772d09f919e3e7042e42d2334a17fbea5c29c70210726c0eb9b3606 not found: ID does not exist" Apr 16 20:45:27.861418 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.861364 2563 scope.go:117] "RemoveContainer" containerID="495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63" Apr 16 20:45:27.861629 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:45:27.861608 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63\": container with ID starting with 495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63 not found: ID does not exist" containerID="495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63" Apr 16 20:45:27.861738 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.861634 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63"} err="failed to get container status \"495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63\": rpc error: code = NotFound desc = could not find container \"495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63\": container with ID 
starting with 495452440ce3b7f8172cbf8c2ba17cc7320c5ae071a67f5cd813828dae9ffd63 not found: ID does not exist" Apr 16 20:45:27.862105 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:27.862087 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nb8ht"] Apr 16 20:45:28.840245 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:28.840213 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" event={"ID":"4861a254-3c43-4b14-b080-7c4a5560914f","Type":"ContainerStarted","Data":"1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421"} Apr 16 20:45:28.840739 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:28.840583 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:45:28.841960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:28.841933 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:45:28.859182 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:28.859125 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podStartSLOduration=5.859109278 podStartE2EDuration="5.859109278s" podCreationTimestamp="2026-04-16 20:45:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:45:28.856788937 +0000 UTC m=+2026.145714723" watchObservedRunningTime="2026-04-16 20:45:28.859109278 +0000 UTC m=+2026.148035065" Apr 16 20:45:29.298773 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:29.298740 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" path="/var/lib/kubelet/pods/7211ff84-7df1-4ec6-9e05-0b4309da5e62/volumes" Apr 16 20:45:29.844272 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:29.844228 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:45:39.844333 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:39.844295 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:45:49.844923 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:49.844876 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:45:59.844379 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:45:59.844331 2563 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:46:09.844416 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:09.844322 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:46:19.844439 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:19.844399 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:46:29.845125 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:29.845087 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 20:46:39.845235 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:39.845206 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:46:43.218321 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.218291 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk"] Apr 16 20:46:43.218727 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.218551 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" containerID="cri-o://1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421" gracePeriod=30 Apr 16 20:46:43.269829 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.269802 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6"] Apr 16 20:46:43.270115 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.270100 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="storage-initializer" Apr 16 20:46:43.270177 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.270117 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="storage-initializer" Apr 16 20:46:43.270177 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.270135 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" Apr 16 20:46:43.270177 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.270141 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" Apr 16 20:46:43.270274 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.270192 2563 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="7211ff84-7df1-4ec6-9e05-0b4309da5e62" containerName="kserve-container" Apr 16 20:46:43.273060 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.273046 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:46:43.283493 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.283461 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6"] Apr 16 20:46:43.339614 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.339574 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2956fad-9c18-4236-a62e-0cd71c000b5b-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6\" (UID: \"e2956fad-9c18-4236-a62e-0cd71c000b5b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:46:43.440068 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.440039 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2956fad-9c18-4236-a62e-0cd71c000b5b-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6\" (UID: \"e2956fad-9c18-4236-a62e-0cd71c000b5b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:46:43.440409 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.440387 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2956fad-9c18-4236-a62e-0cd71c000b5b-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6\" (UID: \"e2956fad-9c18-4236-a62e-0cd71c000b5b\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:46:43.588000 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.587921 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:46:43.708815 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:43.708777 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6"] Apr 16 20:46:43.712527 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:46:43.712496 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2956fad_9c18_4236_a62e_0cd71c000b5b.slice/crio-93b6cc6ce43aa2d0e372c9a4d39e6dd8bcbfc22278f055877a61fbd4336ef8ea WatchSource:0}: Error finding container 93b6cc6ce43aa2d0e372c9a4d39e6dd8bcbfc22278f055877a61fbd4336ef8ea: Status 404 returned error can't find the container with id 93b6cc6ce43aa2d0e372c9a4d39e6dd8bcbfc22278f055877a61fbd4336ef8ea Apr 16 20:46:44.068675 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:44.068616 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" event={"ID":"e2956fad-9c18-4236-a62e-0cd71c000b5b","Type":"ContainerStarted","Data":"6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152"} Apr 16 20:46:44.068675 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:44.068671 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" event={"ID":"e2956fad-9c18-4236-a62e-0cd71c000b5b","Type":"ContainerStarted","Data":"93b6cc6ce43aa2d0e372c9a4d39e6dd8bcbfc22278f055877a61fbd4336ef8ea"} Apr 16 20:46:47.862547 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:47.862521 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:46:47.977928 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:47.977900 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4861a254-3c43-4b14-b080-7c4a5560914f-kserve-provision-location\") pod \"4861a254-3c43-4b14-b080-7c4a5560914f\" (UID: \"4861a254-3c43-4b14-b080-7c4a5560914f\") " Apr 16 20:46:47.978187 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:47.978163 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4861a254-3c43-4b14-b080-7c4a5560914f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4861a254-3c43-4b14-b080-7c4a5560914f" (UID: "4861a254-3c43-4b14-b080-7c4a5560914f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:46:48.079037 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.078983 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4861a254-3c43-4b14-b080-7c4a5560914f-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:46:48.083934 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.083904 2563 generic.go:358] "Generic (PLEG): container finished" podID="4861a254-3c43-4b14-b080-7c4a5560914f" containerID="1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421" exitCode=0 Apr 16 20:46:48.084041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.083984 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" Apr 16 20:46:48.084041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.083994 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" event={"ID":"4861a254-3c43-4b14-b080-7c4a5560914f","Type":"ContainerDied","Data":"1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421"} Apr 16 20:46:48.084159 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.084042 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk" event={"ID":"4861a254-3c43-4b14-b080-7c4a5560914f","Type":"ContainerDied","Data":"a48fbd9fee74f76f17b407de11e2ea4d9bda3f707b475b64cd3d42bfbf16cbcd"} Apr 16 20:46:48.084159 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.084059 2563 scope.go:117] "RemoveContainer" containerID="1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421" Apr 16 20:46:48.088884 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.088859 2563 generic.go:358] "Generic (PLEG): container finished" podID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerID="6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152" exitCode=0 Apr 16 20:46:48.088972 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.088908 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" event={"ID":"e2956fad-9c18-4236-a62e-0cd71c000b5b","Type":"ContainerDied","Data":"6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152"} Apr 16 20:46:48.095706 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.095686 2563 scope.go:117] "RemoveContainer" containerID="ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5" Apr 16 20:46:48.102642 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.102623 2563 scope.go:117] "RemoveContainer" containerID="1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421" Apr 16 20:46:48.102903 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:46:48.102884 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421\": container with ID starting with 1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421 not found: ID does not exist" containerID="1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421" Apr 16 20:46:48.102975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.102911 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421"} err="failed to get container status \"1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421\": rpc error: code = NotFound desc = could not find container \"1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421\": container with ID starting with 1a8825d758d685d26afa73173c6252ba4edaa169eb74c0d820adec6de99b1421 not found: ID does not exist" Apr 16 20:46:48.102975 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.102928 2563 scope.go:117] "RemoveContainer" containerID="ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5" Apr 16 20:46:48.103173 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:46:48.103151 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5\": container with ID starting with ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5 not found: ID does not exist" containerID="ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5" Apr 16 20:46:48.103230 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.103180 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5"} err="failed to get container status \"ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5\": rpc error: code = NotFound desc = could not find container \"ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5\": container with ID starting with ae4d5def84f02088848c554e149bd068a0409a917c375b0293ba6737ea7b6be5 not found: ID does not exist" Apr 16 20:46:48.118762 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.118738 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk"] Apr 16 20:46:48.127586 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:48.125047 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-x9vtk"] Apr 16 20:46:49.094249 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:49.094204 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" event={"ID":"e2956fad-9c18-4236-a62e-0cd71c000b5b","Type":"ContainerStarted","Data":"6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b"} Apr 16 20:46:49.094667 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:49.094427 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:46:49.112993 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:49.112949 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" podStartSLOduration=6.112934295 podStartE2EDuration="6.112934295s" podCreationTimestamp="2026-04-16 20:46:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:46:49.11093113 +0000 UTC m=+2106.399856930" watchObservedRunningTime="2026-04-16 20:46:49.112934295 +0000 UTC m=+2106.401860079" Apr 16 20:46:49.299603 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:46:49.299548 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" path="/var/lib/kubelet/pods/4861a254-3c43-4b14-b080-7c4a5560914f/volumes" Apr 16 20:47:20.099615 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:47:20.099548 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:47:30.098988 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:47:30.098949 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" 
containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:47:40.098357 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:47:40.098279 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:47:50.098341 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:47:50.098301 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.49:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 20:48:00.102122 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:00.102071 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:48:03.398392 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.398356 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6"] Apr 16 20:48:03.398771 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.398729 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="kserve-container" containerID="cri-o://6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b" gracePeriod=30 Apr 16 20:48:03.476156 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.476116 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf"] Apr 16 20:48:03.476514 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.476497 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" Apr 16 20:48:03.476627 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.476517 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" Apr 16 20:48:03.476627 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.476535 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="storage-initializer" Apr 16 20:48:03.476627 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.476543 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="storage-initializer" Apr 16 20:48:03.476800 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.476658 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="4861a254-3c43-4b14-b080-7c4a5560914f" containerName="kserve-container" Apr 16 20:48:03.480754 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.480734 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:48:03.510836 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.510806 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf"] Apr 16 20:48:03.534895 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.534842 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ee2c14e-399c-406c-bd45-f02a6f268a01-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf\" (UID: \"0ee2c14e-399c-406c-bd45-f02a6f268a01\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:48:03.636231 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.636198 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ee2c14e-399c-406c-bd45-f02a6f268a01-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf\" (UID: \"0ee2c14e-399c-406c-bd45-f02a6f268a01\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:48:03.636580 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.636540 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ee2c14e-399c-406c-bd45-f02a6f268a01-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf\" (UID: \"0ee2c14e-399c-406c-bd45-f02a6f268a01\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:48:03.791450 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.791415 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:48:03.908931 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:03.908896 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf"] Apr 16 20:48:03.911867 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:48:03.911839 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ee2c14e_399c_406c_bd45_f02a6f268a01.slice/crio-0fe4679cf0a7f1de9fd0267e6b671cae2b552214e0590351f9aacf008ad9a941 WatchSource:0}: Error finding container 0fe4679cf0a7f1de9fd0267e6b671cae2b552214e0590351f9aacf008ad9a941: Status 404 returned error can't find the container with id 0fe4679cf0a7f1de9fd0267e6b671cae2b552214e0590351f9aacf008ad9a941 Apr 16 20:48:04.333978 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:04.333930 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" event={"ID":"0ee2c14e-399c-406c-bd45-f02a6f268a01","Type":"ContainerStarted","Data":"4135b5d97e0fc04204c3a9a4f8cbaf9aee7e62bbfc992b5939423e775c7865ac"} Apr 16 20:48:04.333978 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:04.333976 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" event={"ID":"0ee2c14e-399c-406c-bd45-f02a6f268a01","Type":"ContainerStarted","Data":"0fe4679cf0a7f1de9fd0267e6b671cae2b552214e0590351f9aacf008ad9a941"} Apr 16 20:48:08.137782 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.137755 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:48:08.170822 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.170781 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2956fad-9c18-4236-a62e-0cd71c000b5b-kserve-provision-location\") pod \"e2956fad-9c18-4236-a62e-0cd71c000b5b\" (UID: \"e2956fad-9c18-4236-a62e-0cd71c000b5b\") " Apr 16 20:48:08.171130 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.171104 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2956fad-9c18-4236-a62e-0cd71c000b5b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e2956fad-9c18-4236-a62e-0cd71c000b5b" (UID: "e2956fad-9c18-4236-a62e-0cd71c000b5b"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:48:08.272250 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.272155 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e2956fad-9c18-4236-a62e-0cd71c000b5b-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:48:08.347811 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.347778 2563 generic.go:358] "Generic (PLEG): container finished" podID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerID="4135b5d97e0fc04204c3a9a4f8cbaf9aee7e62bbfc992b5939423e775c7865ac" exitCode=0 Apr 16 20:48:08.347959 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.347861 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" event={"ID":"0ee2c14e-399c-406c-bd45-f02a6f268a01","Type":"ContainerDied","Data":"4135b5d97e0fc04204c3a9a4f8cbaf9aee7e62bbfc992b5939423e775c7865ac"} Apr 16 20:48:08.349347 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.349319 2563 generic.go:358] "Generic (PLEG): container finished" podID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerID="6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b" exitCode=0 Apr 16 20:48:08.349456 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.349376 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" event={"ID":"e2956fad-9c18-4236-a62e-0cd71c000b5b","Type":"ContainerDied","Data":"6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b"} Apr 16 20:48:08.349456 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.349406 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" event={"ID":"e2956fad-9c18-4236-a62e-0cd71c000b5b","Type":"ContainerDied","Data":"93b6cc6ce43aa2d0e372c9a4d39e6dd8bcbfc22278f055877a61fbd4336ef8ea"} Apr 16 20:48:08.349456 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.349411 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6" Apr 16 20:48:08.349456 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.349427 2563 scope.go:117] "RemoveContainer" containerID="6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b" Apr 16 20:48:08.357391 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.357219 2563 scope.go:117] "RemoveContainer" containerID="6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152" Apr 16 20:48:08.364758 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.364739 2563 scope.go:117] "RemoveContainer" containerID="6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b" Apr 16 20:48:08.365170 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:48:08.365150 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b\": container with ID starting with 6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b not found: ID does not exist" containerID="6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b" Apr 16 20:48:08.365247 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.365179 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b"} err="failed to get container status \"6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b\": rpc error: code = NotFound desc = could not find container \"6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b\": container with ID starting with 6b4c0b3a7c7afc6ad43cf3ace5adb6dd777b8015dcf8a3bda9404f485f45667b not found: ID does not exist" Apr 16 20:48:08.365247 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.365197 2563 scope.go:117] "RemoveContainer" containerID="6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152" Apr 16 20:48:08.365457 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:48:08.365443 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152\": container with ID starting with 6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152 not found: ID does not exist" containerID="6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152" Apr 16 20:48:08.365505 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.365461 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152"} err="failed to get container status \"6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152\": rpc error: code = NotFound desc = could not find container \"6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152\": container with ID starting with 6fedab91004d1b2e47ec7130390b9880be48da55f043b1e896d666189436b152 not found: ID does not exist" Apr 16 20:48:08.376222 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.376198 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6"] Apr 16 20:48:08.381024 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:08.381002 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-jrpv6"] Apr 16 20:48:09.299069 
ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:09.299033 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" path="/var/lib/kubelet/pods/e2956fad-9c18-4236-a62e-0cd71c000b5b/volumes" Apr 16 20:48:09.355485 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:09.355446 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" event={"ID":"0ee2c14e-399c-406c-bd45-f02a6f268a01","Type":"ContainerStarted","Data":"1a2a87f6bde94561b788e1701fcad265e586a6c96a247e8eaf46768456aca3d7"} Apr 16 20:48:09.355678 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:09.355662 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:48:09.373835 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:09.373783 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" podStartSLOduration=6.373767343 podStartE2EDuration="6.373767343s" podCreationTimestamp="2026-04-16 20:48:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:48:09.37283094 +0000 UTC m=+2186.661756724" watchObservedRunningTime="2026-04-16 20:48:09.373767343 +0000 UTC m=+2186.662693129" Apr 16 20:48:40.360990 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:40.360945 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.50:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 20:48:50.359662 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:48:50.359622 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.50:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 20:49:00.359223 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:00.359180 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.50:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 20:49:10.359364 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:10.359282 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.50:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 20:49:20.363285 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:20.363251 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 
20:49:23.571701 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.571670 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf"] Apr 16 20:49:23.572130 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.571918 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" containerID="cri-o://1a2a87f6bde94561b788e1701fcad265e586a6c96a247e8eaf46768456aca3d7" gracePeriod=30 Apr 16 20:49:23.630901 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.630866 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb"] Apr 16 20:49:23.631251 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.631235 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="storage-initializer" Apr 16 20:49:23.631312 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.631255 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="storage-initializer" Apr 16 20:49:23.631312 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.631295 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="kserve-container" Apr 16 20:49:23.631312 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.631304 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="kserve-container" Apr 16 20:49:23.631434 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.631384 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="e2956fad-9c18-4236-a62e-0cd71c000b5b" containerName="kserve-container" Apr 16 20:49:23.634440 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.634421 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:49:23.643850 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.643828 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb"] Apr 16 20:49:23.727973 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.727934 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84deeb0e-9eda-42d3-a2bd-c1021b014e13-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb\" (UID: \"84deeb0e-9eda-42d3-a2bd-c1021b014e13\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:49:23.829062 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.828995 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84deeb0e-9eda-42d3-a2bd-c1021b014e13-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb\" (UID: \"84deeb0e-9eda-42d3-a2bd-c1021b014e13\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:49:23.829337 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.829318 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84deeb0e-9eda-42d3-a2bd-c1021b014e13-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb\" (UID: \"84deeb0e-9eda-42d3-a2bd-c1021b014e13\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:49:23.946729 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:23.946706 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:49:24.060358 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:24.060323 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb"] Apr 16 20:49:24.062306 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:49:24.062277 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84deeb0e_9eda_42d3_a2bd_c1021b014e13.slice/crio-207e0e8f1d3f537f7399757332de742806920a1a7526dc2ac521616fd2430868 WatchSource:0}: Error finding container 207e0e8f1d3f537f7399757332de742806920a1a7526dc2ac521616fd2430868: Status 404 returned error can't find the container with id 207e0e8f1d3f537f7399757332de742806920a1a7526dc2ac521616fd2430868 Apr 16 20:49:24.588076 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:24.588044 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" event={"ID":"84deeb0e-9eda-42d3-a2bd-c1021b014e13","Type":"ContainerStarted","Data":"e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2"} Apr 16 20:49:24.588432 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:24.588082 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" event={"ID":"84deeb0e-9eda-42d3-a2bd-c1021b014e13","Type":"ContainerStarted","Data":"207e0e8f1d3f537f7399757332de742806920a1a7526dc2ac521616fd2430868"} Apr 16 20:49:27.601351 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.601315 2563 generic.go:358] "Generic (PLEG): container finished" podID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerID="1a2a87f6bde94561b788e1701fcad265e586a6c96a247e8eaf46768456aca3d7" exitCode=0 Apr 16 20:49:27.601659 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.601375 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" event={"ID":"0ee2c14e-399c-406c-bd45-f02a6f268a01","Type":"ContainerDied","Data":"1a2a87f6bde94561b788e1701fcad265e586a6c96a247e8eaf46768456aca3d7"} Apr 16 20:49:27.601659 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.601407 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" event={"ID":"0ee2c14e-399c-406c-bd45-f02a6f268a01","Type":"ContainerDied","Data":"0fe4679cf0a7f1de9fd0267e6b671cae2b552214e0590351f9aacf008ad9a941"} Apr 16 20:49:27.601659 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.601417 2563 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0fe4679cf0a7f1de9fd0267e6b671cae2b552214e0590351f9aacf008ad9a941" Apr 16 20:49:27.607925 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.607910 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:49:27.659291 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.659270 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ee2c14e-399c-406c-bd45-f02a6f268a01-kserve-provision-location\") pod \"0ee2c14e-399c-406c-bd45-f02a6f268a01\" (UID: \"0ee2c14e-399c-406c-bd45-f02a6f268a01\") " Apr 16 20:49:27.659609 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.659582 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ee2c14e-399c-406c-bd45-f02a6f268a01-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0ee2c14e-399c-406c-bd45-f02a6f268a01" (UID: "0ee2c14e-399c-406c-bd45-f02a6f268a01"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:49:27.760410 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:27.760352 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0ee2c14e-399c-406c-bd45-f02a6f268a01-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:49:28.605731 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:28.605695 2563 generic.go:358] "Generic (PLEG): container finished" podID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerID="e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2" exitCode=0 Apr 16 20:49:28.606200 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:28.605767 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" event={"ID":"84deeb0e-9eda-42d3-a2bd-c1021b014e13","Type":"ContainerDied","Data":"e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2"} Apr 16 20:49:28.606200 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:28.605932 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf" Apr 16 20:49:28.635083 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:28.635053 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf"] Apr 16 20:49:28.640133 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:28.640103 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-kgxdf"] Apr 16 20:49:29.298148 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:29.298113 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" path="/var/lib/kubelet/pods/0ee2c14e-399c-406c-bd45-f02a6f268a01/volumes" Apr 16 20:49:29.610293 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:29.610212 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" event={"ID":"84deeb0e-9eda-42d3-a2bd-c1021b014e13","Type":"ContainerStarted","Data":"688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5"} Apr 16 20:49:29.610645 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:29.610433 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:49:29.627971 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:49:29.627932 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" podStartSLOduration=6.62792001 podStartE2EDuration="6.62792001s" podCreationTimestamp="2026-04-16 20:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:49:29.626795907 +0000 UTC m=+2266.915721692" watchObservedRunningTime="2026-04-16 20:49:29.62792001 +0000 UTC m=+2266.916845796" Apr 16 20:50:00.615743 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:00.615702 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.51:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 20:50:10.615041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:10.614998 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.51:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 20:50:20.614985 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:20.614943 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.51:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 20:50:30.614182 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:30.614136 2563 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.51:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.51:8080: connect: connection refused" Apr 16 20:50:40.618364 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:40.618279 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:50:43.769608 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:43.769547 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb"] Apr 16 20:50:43.770160 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:43.769811 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" containerID="cri-o://688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5" gracePeriod=30 Apr 16 20:50:45.956727 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.956693 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l"] Apr 16 20:50:45.957165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.957031 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" Apr 16 20:50:45.957165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.957042 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" Apr 16 20:50:45.957165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.957058 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="storage-initializer" Apr 16 20:50:45.957165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.957064 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="storage-initializer" Apr 16 20:50:45.957165 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.957115 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="0ee2c14e-399c-406c-bd45-f02a6f268a01" containerName="kserve-container" Apr 16 20:50:45.960263 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.960238 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:50:45.967949 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:45.967254 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l"] Apr 16 20:50:46.013889 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.013858 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/754b806c-34a0-47dc-b441-9a35a965f52c-kserve-provision-location\") pod \"isvc-sklearn-predictor-5b8ffc6f57-h782l\" (UID: \"754b806c-34a0-47dc-b441-9a35a965f52c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:50:46.115301 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.115276 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/754b806c-34a0-47dc-b441-9a35a965f52c-kserve-provision-location\") pod \"isvc-sklearn-predictor-5b8ffc6f57-h782l\" (UID: \"754b806c-34a0-47dc-b441-9a35a965f52c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:50:46.115618 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.115602 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/754b806c-34a0-47dc-b441-9a35a965f52c-kserve-provision-location\") pod \"isvc-sklearn-predictor-5b8ffc6f57-h782l\" (UID: \"754b806c-34a0-47dc-b441-9a35a965f52c\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:50:46.272194 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.272136 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:50:46.393799 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.393772 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l"] Apr 16 20:50:46.396002 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:50:46.395972 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod754b806c_34a0_47dc_b441_9a35a965f52c.slice/crio-d39ef31c56449dd399b0fd9fe5c79e2e261167b2ca1ca068b92e8b6e9b47b168 WatchSource:0}: Error finding container d39ef31c56449dd399b0fd9fe5c79e2e261167b2ca1ca068b92e8b6e9b47b168: Status 404 returned error can't find the container with id d39ef31c56449dd399b0fd9fe5c79e2e261167b2ca1ca068b92e8b6e9b47b168 Apr 16 20:50:46.398200 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.398185 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:50:46.852976 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.852937 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" event={"ID":"754b806c-34a0-47dc-b441-9a35a965f52c","Type":"ContainerStarted","Data":"bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b"} Apr 16 20:50:46.852976 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:46.852976 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" event={"ID":"754b806c-34a0-47dc-b441-9a35a965f52c","Type":"ContainerStarted","Data":"d39ef31c56449dd399b0fd9fe5c79e2e261167b2ca1ca068b92e8b6e9b47b168"} Apr 16 20:50:48.507554 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.507531 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:50:48.536024 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.535993 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84deeb0e-9eda-42d3-a2bd-c1021b014e13-kserve-provision-location\") pod \"84deeb0e-9eda-42d3-a2bd-c1021b014e13\" (UID: \"84deeb0e-9eda-42d3-a2bd-c1021b014e13\") " Apr 16 20:50:48.536285 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.536266 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84deeb0e-9eda-42d3-a2bd-c1021b014e13-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84deeb0e-9eda-42d3-a2bd-c1021b014e13" (UID: "84deeb0e-9eda-42d3-a2bd-c1021b014e13"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:50:48.637369 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.637312 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84deeb0e-9eda-42d3-a2bd-c1021b014e13-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:50:48.859970 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.859931 2563 generic.go:358] "Generic (PLEG): container finished" podID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerID="688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5" exitCode=0 Apr 16 20:50:48.860126 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.859986 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" event={"ID":"84deeb0e-9eda-42d3-a2bd-c1021b014e13","Type":"ContainerDied","Data":"688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5"} Apr 16 20:50:48.860126 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.860012 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" event={"ID":"84deeb0e-9eda-42d3-a2bd-c1021b014e13","Type":"ContainerDied","Data":"207e0e8f1d3f537f7399757332de742806920a1a7526dc2ac521616fd2430868"} Apr 16 20:50:48.860126 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.860024 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb" Apr 16 20:50:48.860316 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.860029 2563 scope.go:117] "RemoveContainer" containerID="688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5" Apr 16 20:50:48.867984 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.867964 2563 scope.go:117] "RemoveContainer" containerID="e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2" Apr 16 20:50:48.874751 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.874732 2563 scope.go:117] "RemoveContainer" containerID="688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5" Apr 16 20:50:48.874977 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:50:48.874959 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5\": container with ID starting with 688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5 not found: ID does not exist" containerID="688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5" Apr 16 20:50:48.875026 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.874984 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5"} err="failed to get container status \"688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5\": rpc error: code = NotFound desc = could not find container \"688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5\": container with ID starting with 688c4cb1e599b455ab91163a882e5543be5cb42644cbc49444278176ddf872e5 not found: ID does not exist" Apr 16 20:50:48.875026 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.875001 2563 scope.go:117] "RemoveContainer" containerID="e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2" Apr 16 20:50:48.875236 
ip-10-0-138-118 kubenswrapper[2563]: E0416 20:50:48.875219 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2\": container with ID starting with e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2 not found: ID does not exist" containerID="e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2" Apr 16 20:50:48.875285 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.875239 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2"} err="failed to get container status \"e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2\": rpc error: code = NotFound desc = could not find container \"e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2\": container with ID starting with e25bc853cec06ea8e931a47b6ef645a13b4b40a06c03208a0548d514e8f8c4c2 not found: ID does not exist" Apr 16 20:50:48.880427 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.880407 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb"] Apr 16 20:50:48.884544 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:48.884524 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-k7jwb"] Apr 16 20:50:49.298961 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:49.298928 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" path="/var/lib/kubelet/pods/84deeb0e-9eda-42d3-a2bd-c1021b014e13/volumes" Apr 16 20:50:50.867868 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:50.867837 2563 generic.go:358] "Generic (PLEG): container finished" podID="754b806c-34a0-47dc-b441-9a35a965f52c" containerID="bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b" exitCode=0 Apr 16 20:50:50.868326 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:50.867914 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" event={"ID":"754b806c-34a0-47dc-b441-9a35a965f52c","Type":"ContainerDied","Data":"bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b"} Apr 16 20:50:51.872150 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:51.872116 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" event={"ID":"754b806c-34a0-47dc-b441-9a35a965f52c","Type":"ContainerStarted","Data":"9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8"} Apr 16 20:50:51.872578 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:51.872396 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:50:51.873729 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:51.873703 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 16 20:50:51.893835 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:51.893792 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podStartSLOduration=6.893779725 podStartE2EDuration="6.893779725s" podCreationTimestamp="2026-04-16 20:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:50:51.892201206 +0000 UTC m=+2349.181126992" watchObservedRunningTime="2026-04-16 20:50:51.893779725 +0000 UTC m=+2349.182705511" Apr 16 20:50:52.876071 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:50:52.876035 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 16 20:51:02.876401 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:02.876360 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 16 20:51:12.876103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:12.876062 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 16 20:51:22.876297 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:22.876254 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 16 20:51:32.877050 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:32.877010 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 16 20:51:42.876250 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:42.876190 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.52:8080: connect: connection refused" Apr 16 20:51:52.877635 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:52.877603 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:51:56.068174 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.068138 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l"] Apr 16 20:51:56.068532 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.068456 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" containerID="cri-o://9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8" gracePeriod=30 Apr 16 20:51:56.123898 ip-10-0-138-118 kubenswrapper[2563]: 
I0416 20:51:56.123868 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp"] Apr 16 20:51:56.124178 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.124164 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="storage-initializer" Apr 16 20:51:56.124178 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.124178 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="storage-initializer" Apr 16 20:51:56.124273 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.124186 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" Apr 16 20:51:56.124273 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.124192 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" Apr 16 20:51:56.124273 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.124249 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="84deeb0e-9eda-42d3-a2bd-c1021b014e13" containerName="kserve-container" Apr 16 20:51:56.128047 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.128021 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:51:56.138102 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.138074 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp"] Apr 16 20:51:56.244790 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.244761 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/713a56a3-69ae-49d8-a3a4-b9da400ab2e1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-ngzrp\" (UID: \"713a56a3-69ae-49d8-a3a4-b9da400ab2e1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:51:56.345753 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.345691 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/713a56a3-69ae-49d8-a3a4-b9da400ab2e1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-ngzrp\" (UID: \"713a56a3-69ae-49d8-a3a4-b9da400ab2e1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:51:56.346001 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.345985 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/713a56a3-69ae-49d8-a3a4-b9da400ab2e1-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-ngzrp\" (UID: \"713a56a3-69ae-49d8-a3a4-b9da400ab2e1\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:51:56.441129 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.441084 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:51:56.560484 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:56.560457 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp"] Apr 16 20:51:56.562281 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:51:56.562249 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod713a56a3_69ae_49d8_a3a4_b9da400ab2e1.slice/crio-0e2a74ed8e235b199f6ade35773e0fbed15046790e5bb7cd200b191810d31aeb WatchSource:0}: Error finding container 0e2a74ed8e235b199f6ade35773e0fbed15046790e5bb7cd200b191810d31aeb: Status 404 returned error can't find the container with id 0e2a74ed8e235b199f6ade35773e0fbed15046790e5bb7cd200b191810d31aeb Apr 16 20:51:57.080828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:57.080790 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" event={"ID":"713a56a3-69ae-49d8-a3a4-b9da400ab2e1","Type":"ContainerStarted","Data":"31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3"} Apr 16 20:51:57.080828 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:57.080830 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" event={"ID":"713a56a3-69ae-49d8-a3a4-b9da400ab2e1","Type":"ContainerStarted","Data":"0e2a74ed8e235b199f6ade35773e0fbed15046790e5bb7cd200b191810d31aeb"} Apr 16 20:51:59.911965 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:51:59.911943 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:52:00.078400 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.078319 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/754b806c-34a0-47dc-b441-9a35a965f52c-kserve-provision-location\") pod \"754b806c-34a0-47dc-b441-9a35a965f52c\" (UID: \"754b806c-34a0-47dc-b441-9a35a965f52c\") " Apr 16 20:52:00.078647 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.078625 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/754b806c-34a0-47dc-b441-9a35a965f52c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "754b806c-34a0-47dc-b441-9a35a965f52c" (UID: "754b806c-34a0-47dc-b441-9a35a965f52c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:52:00.092831 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.092801 2563 generic.go:358] "Generic (PLEG): container finished" podID="754b806c-34a0-47dc-b441-9a35a965f52c" containerID="9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8" exitCode=0 Apr 16 20:52:00.092950 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.092863 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" Apr 16 20:52:00.092950 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.092881 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" event={"ID":"754b806c-34a0-47dc-b441-9a35a965f52c","Type":"ContainerDied","Data":"9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8"} Apr 16 20:52:00.092950 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.092911 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l" event={"ID":"754b806c-34a0-47dc-b441-9a35a965f52c","Type":"ContainerDied","Data":"d39ef31c56449dd399b0fd9fe5c79e2e261167b2ca1ca068b92e8b6e9b47b168"} Apr 16 20:52:00.092950 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.092929 2563 scope.go:117] "RemoveContainer" containerID="9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8" Apr 16 20:52:00.100810 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.100793 2563 scope.go:117] "RemoveContainer" containerID="bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b" Apr 16 20:52:00.107795 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.107781 2563 scope.go:117] "RemoveContainer" containerID="9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8" Apr 16 20:52:00.108026 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:52:00.108006 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8\": container with ID starting with 9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8 not found: ID does not exist" containerID="9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8" Apr 16 20:52:00.108087 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.108035 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8"} err="failed to get container status \"9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8\": rpc error: code = NotFound desc = could not find container \"9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8\": container with ID starting with 9a1b1da53cf309520d0d46abb0d3831027070509110eb52cc0a56c7db68a78e8 not found: ID does not exist" Apr 16 20:52:00.108087 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.108052 2563 scope.go:117] "RemoveContainer" containerID="bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b" Apr 16 20:52:00.108248 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:52:00.108233 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b\": container with ID starting with bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b not found: ID does not exist" containerID="bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b" Apr 16 20:52:00.108289 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.108253 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b"} err="failed to get container status \"bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b\": rpc error: code = NotFound desc = 
could not find container \"bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b\": container with ID starting with bd427fad1e6ace91f9f841c78b2f29bcd9cd2f538603c1bff3368c3a8399b56b not found: ID does not exist" Apr 16 20:52:00.114437 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.114413 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l"] Apr 16 20:52:00.120199 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.120178 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-5b8ffc6f57-h782l"] Apr 16 20:52:00.179664 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:00.179641 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/754b806c-34a0-47dc-b441-9a35a965f52c-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:52:01.096924 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:01.096889 2563 generic.go:358] "Generic (PLEG): container finished" podID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerID="31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3" exitCode=0 Apr 16 20:52:01.097357 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:01.096970 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" event={"ID":"713a56a3-69ae-49d8-a3a4-b9da400ab2e1","Type":"ContainerDied","Data":"31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3"} Apr 16 20:52:01.298836 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:01.298804 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" path="/var/lib/kubelet/pods/754b806c-34a0-47dc-b441-9a35a965f52c/volumes" Apr 16 20:52:02.102960 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:02.102875 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" event={"ID":"713a56a3-69ae-49d8-a3a4-b9da400ab2e1","Type":"ContainerStarted","Data":"85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4"} Apr 16 20:52:02.103352 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:02.103231 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:52:33.130324 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:33.130274 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 20:52:43.108851 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:43.108819 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:52:43.153150 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:43.153100 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" podStartSLOduration=47.153084262 podStartE2EDuration="47.153084262s" podCreationTimestamp="2026-04-16 20:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:52:02.122301142 +0000 
UTC m=+2419.411226927" watchObservedRunningTime="2026-04-16 20:52:43.153084262 +0000 UTC m=+2460.442010049" Apr 16 20:52:46.238126 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.238089 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp"] Apr 16 20:52:46.238478 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.238361 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerName="kserve-container" containerID="cri-o://85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4" gracePeriod=30 Apr 16 20:52:46.323569 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.323525 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt"] Apr 16 20:52:46.324049 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.324034 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="storage-initializer" Apr 16 20:52:46.324103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.324054 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="storage-initializer" Apr 16 20:52:46.324103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.324073 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" Apr 16 20:52:46.324103 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.324082 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" Apr 16 20:52:46.324255 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.324154 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="754b806c-34a0-47dc-b441-9a35a965f52c" containerName="kserve-container" Apr 16 20:52:46.327443 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.327423 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:52:46.334190 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.334164 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt"] Apr 16 20:52:46.441741 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.441708 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6609b24f-7c63-4d49-8b9f-bd3cc52154d1-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-59c87754d7-qk9bt\" (UID: \"6609b24f-7c63-4d49-8b9f-bd3cc52154d1\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:52:46.542273 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.542195 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6609b24f-7c63-4d49-8b9f-bd3cc52154d1-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-59c87754d7-qk9bt\" (UID: \"6609b24f-7c63-4d49-8b9f-bd3cc52154d1\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:52:46.542555 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.542534 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6609b24f-7c63-4d49-8b9f-bd3cc52154d1-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-59c87754d7-qk9bt\" (UID: \"6609b24f-7c63-4d49-8b9f-bd3cc52154d1\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:52:46.639901 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.639871 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:52:46.754520 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:46.754487 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt"] Apr 16 20:52:46.758405 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:52:46.758377 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6609b24f_7c63_4d49_8b9f_bd3cc52154d1.slice/crio-ffe5d50cbcdedd064ecbd2bdc644801e4da4b59d9fd0b3e1803ccfa4e5decc30 WatchSource:0}: Error finding container ffe5d50cbcdedd064ecbd2bdc644801e4da4b59d9fd0b3e1803ccfa4e5decc30: Status 404 returned error can't find the container with id ffe5d50cbcdedd064ecbd2bdc644801e4da4b59d9fd0b3e1803ccfa4e5decc30 Apr 16 20:52:47.256864 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:47.256825 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" event={"ID":"6609b24f-7c63-4d49-8b9f-bd3cc52154d1","Type":"ContainerStarted","Data":"32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d"} Apr 16 20:52:47.256864 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:47.256867 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" event={"ID":"6609b24f-7c63-4d49-8b9f-bd3cc52154d1","Type":"ContainerStarted","Data":"ffe5d50cbcdedd064ecbd2bdc644801e4da4b59d9fd0b3e1803ccfa4e5decc30"} Apr 16 20:52:52.274539 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:52.274498 2563 generic.go:358] "Generic (PLEG): container finished" podID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerID="32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d" exitCode=0 Apr 16 20:52:52.275062 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:52.274542 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" event={"ID":"6609b24f-7c63-4d49-8b9f-bd3cc52154d1","Type":"ContainerDied","Data":"32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d"} Apr 16 20:52:53.086967 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.086944 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:52:53.200759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.200676 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/713a56a3-69ae-49d8-a3a4-b9da400ab2e1-kserve-provision-location\") pod \"713a56a3-69ae-49d8-a3a4-b9da400ab2e1\" (UID: \"713a56a3-69ae-49d8-a3a4-b9da400ab2e1\") " Apr 16 20:52:53.200997 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.200977 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/713a56a3-69ae-49d8-a3a4-b9da400ab2e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "713a56a3-69ae-49d8-a3a4-b9da400ab2e1" (UID: "713a56a3-69ae-49d8-a3a4-b9da400ab2e1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:52:53.279662 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.279628 2563 generic.go:358] "Generic (PLEG): container finished" podID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerID="85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4" exitCode=0 Apr 16 20:52:53.280091 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.279707 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" Apr 16 20:52:53.280091 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.279718 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" event={"ID":"713a56a3-69ae-49d8-a3a4-b9da400ab2e1","Type":"ContainerDied","Data":"85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4"} Apr 16 20:52:53.280091 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.279755 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp" event={"ID":"713a56a3-69ae-49d8-a3a4-b9da400ab2e1","Type":"ContainerDied","Data":"0e2a74ed8e235b199f6ade35773e0fbed15046790e5bb7cd200b191810d31aeb"} Apr 16 20:52:53.280091 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.279775 2563 scope.go:117] "RemoveContainer" containerID="85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4" Apr 16 20:52:53.281731 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.281706 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" event={"ID":"6609b24f-7c63-4d49-8b9f-bd3cc52154d1","Type":"ContainerStarted","Data":"8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec"} Apr 16 20:52:53.282011 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.281995 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:52:53.283479 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.283454 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 20:52:53.288691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.288672 2563 scope.go:117] "RemoveContainer" containerID="31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3" Apr 16 20:52:53.295656 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.295637 2563 scope.go:117] "RemoveContainer" containerID="85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4" Apr 16 20:52:53.295925 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:52:53.295909 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4\": container with ID starting with 85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4 not found: ID does not exist" containerID="85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4" Apr 16 20:52:53.295986 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.295932 2563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4"} err="failed to get container status \"85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4\": rpc error: code = NotFound desc = could not find container \"85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4\": container with ID starting with 85845175e97df4f47f2053b32f99869583c0e96a3f99b1eff5f4c4ae41eb46b4 not found: ID does not exist" Apr 16 20:52:53.295986 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.295951 2563 scope.go:117] "RemoveContainer" containerID="31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3" Apr 16 20:52:53.300664 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:52:53.300631 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3\": container with ID starting with 31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3 not found: ID does not exist" containerID="31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3" Apr 16 20:52:53.300757 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.300673 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3"} err="failed to get container status \"31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3\": rpc error: code = NotFound desc = could not find container \"31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3\": container with ID starting with 31a4fb5446dc8c6f1f70f6772436a28bdab36c3b25740eeab506b8e8e77075e3 not found: ID does not exist" Apr 16 20:52:53.303172 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.303149 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/713a56a3-69ae-49d8-a3a4-b9da400ab2e1-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:52:53.304912 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.304871 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" podStartSLOduration=7.304859696 podStartE2EDuration="7.304859696s" podCreationTimestamp="2026-04-16 20:52:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:52:53.303405171 +0000 UTC m=+2470.592330956" watchObservedRunningTime="2026-04-16 20:52:53.304859696 +0000 UTC m=+2470.593785480" Apr 16 20:52:53.315409 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.315387 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp"] Apr 16 20:52:53.319087 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:53.319065 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-ngzrp"] Apr 16 20:52:54.286599 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:52:54.286529 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 20:52:55.298077 ip-10-0-138-118 
kubenswrapper[2563]: I0416 20:52:55.298035 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" path="/var/lib/kubelet/pods/713a56a3-69ae-49d8-a3a4-b9da400ab2e1/volumes" Apr 16 20:53:04.287452 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:04.287411 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 20:53:14.287581 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:14.287531 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:53:23.448588 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.448546 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-59c87754d7-qk9bt_6609b24f-7c63-4d49-8b9f-bd3cc52154d1/kserve-container/0.log" Apr 16 20:53:23.583645 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.583611 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt"] Apr 16 20:53:23.583994 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.583962 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" containerID="cri-o://8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec" gracePeriod=30 Apr 16 20:53:23.651860 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.651832 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k"] Apr 16 20:53:23.652132 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.652121 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerName="kserve-container" Apr 16 20:53:23.652182 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.652134 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerName="kserve-container" Apr 16 20:53:23.652182 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.652149 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerName="storage-initializer" Apr 16 20:53:23.652182 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.652155 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerName="storage-initializer" Apr 16 20:53:23.652278 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.652216 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="713a56a3-69ae-49d8-a3a4-b9da400ab2e1" containerName="kserve-container" Apr 16 20:53:23.654951 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.654934 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:53:23.668873 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.668853 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k"] Apr 16 20:53:23.751655 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.751591 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd7cd998-0a69-48f8-93a4-276fcc29ce47-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k\" (UID: \"cd7cd998-0a69-48f8-93a4-276fcc29ce47\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:53:23.852605 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.852581 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd7cd998-0a69-48f8-93a4-276fcc29ce47-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k\" (UID: \"cd7cd998-0a69-48f8-93a4-276fcc29ce47\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:53:23.852926 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.852905 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd7cd998-0a69-48f8-93a4-276fcc29ce47-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k\" (UID: \"cd7cd998-0a69-48f8-93a4-276fcc29ce47\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:53:23.964470 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:23.964445 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:53:24.097533 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.097506 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k"] Apr 16 20:53:24.100510 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:53:24.100479 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7cd998_0a69_48f8_93a4_276fcc29ce47.slice/crio-210d668461e3222cb73d524e8f4b737406e0164481c90f53675e1b00ce6b8218 WatchSource:0}: Error finding container 210d668461e3222cb73d524e8f4b737406e0164481c90f53675e1b00ce6b8218: Status 404 returned error can't find the container with id 210d668461e3222cb73d524e8f4b737406e0164481c90f53675e1b00ce6b8218 Apr 16 20:53:24.287067 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.286975 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 20:53:24.389984 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.389952 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" event={"ID":"cd7cd998-0a69-48f8-93a4-276fcc29ce47","Type":"ContainerStarted","Data":"9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b"} Apr 16 20:53:24.390114 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.389994 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" event={"ID":"cd7cd998-0a69-48f8-93a4-276fcc29ce47","Type":"ContainerStarted","Data":"210d668461e3222cb73d524e8f4b737406e0164481c90f53675e1b00ce6b8218"} Apr 16 20:53:24.519551 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.519524 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:53:24.659458 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.659385 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6609b24f-7c63-4d49-8b9f-bd3cc52154d1-kserve-provision-location\") pod \"6609b24f-7c63-4d49-8b9f-bd3cc52154d1\" (UID: \"6609b24f-7c63-4d49-8b9f-bd3cc52154d1\") " Apr 16 20:53:24.696312 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.696274 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6609b24f-7c63-4d49-8b9f-bd3cc52154d1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6609b24f-7c63-4d49-8b9f-bd3cc52154d1" (UID: "6609b24f-7c63-4d49-8b9f-bd3cc52154d1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:53:24.760858 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:24.760822 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6609b24f-7c63-4d49-8b9f-bd3cc52154d1-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:53:25.394482 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.394437 2563 generic.go:358] "Generic (PLEG): container finished" podID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerID="8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec" exitCode=0 Apr 16 20:53:25.394691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.394531 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" Apr 16 20:53:25.394691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.394582 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" event={"ID":"6609b24f-7c63-4d49-8b9f-bd3cc52154d1","Type":"ContainerDied","Data":"8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec"} Apr 16 20:53:25.394691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.394621 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt" event={"ID":"6609b24f-7c63-4d49-8b9f-bd3cc52154d1","Type":"ContainerDied","Data":"ffe5d50cbcdedd064ecbd2bdc644801e4da4b59d9fd0b3e1803ccfa4e5decc30"} Apr 16 20:53:25.394691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.394638 2563 scope.go:117] "RemoveContainer" containerID="8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec" Apr 16 20:53:25.402316 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.402300 2563 scope.go:117] "RemoveContainer" containerID="32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d" Apr 16 20:53:25.409170 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.409154 2563 scope.go:117] "RemoveContainer" containerID="8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec" Apr 16 20:53:25.409425 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:53:25.409401 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec\": container with ID starting with 8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec not found: ID does not exist" containerID="8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec" Apr 16 20:53:25.409593 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.409433 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec"} err="failed to get container status \"8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec\": rpc error: code = NotFound desc = could not find container \"8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec\": container with ID starting with 8e3535c4583fffb651da25de78ca3868bcbd049b13f2acecb30a1641ad78ceec not found: ID does not exist" Apr 16 20:53:25.409593 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.409456 2563 scope.go:117] "RemoveContainer" containerID="32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d" Apr 16 20:53:25.410267 ip-10-0-138-118 
kubenswrapper[2563]: E0416 20:53:25.410239 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d\": container with ID starting with 32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d not found: ID does not exist" containerID="32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d" Apr 16 20:53:25.410356 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.410272 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d"} err="failed to get container status \"32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d\": rpc error: code = NotFound desc = could not find container \"32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d\": container with ID starting with 32d935982e29bd96333dbf1e57393cb96067fc5fa10d4fa5d6ca86b00f54452d not found: ID does not exist" Apr 16 20:53:25.411392 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.411375 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt"] Apr 16 20:53:25.414911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:25.414892 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-59c87754d7-qk9bt"] Apr 16 20:53:27.298285 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:27.298250 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" path="/var/lib/kubelet/pods/6609b24f-7c63-4d49-8b9f-bd3cc52154d1/volumes" Apr 16 20:53:28.406201 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:28.406171 2563 generic.go:358] "Generic (PLEG): container finished" podID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerID="9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b" exitCode=0 Apr 16 20:53:28.406596 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:28.406243 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" event={"ID":"cd7cd998-0a69-48f8-93a4-276fcc29ce47","Type":"ContainerDied","Data":"9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b"} Apr 16 20:53:29.411745 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:29.411713 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" event={"ID":"cd7cd998-0a69-48f8-93a4-276fcc29ce47","Type":"ContainerStarted","Data":"87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050"} Apr 16 20:53:29.412166 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:29.411925 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:53:29.430650 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:53:29.430601 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" podStartSLOduration=6.430589151 podStartE2EDuration="6.430589151s" podCreationTimestamp="2026-04-16 20:53:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:53:29.429450301 +0000 UTC m=+2506.718376085" 
watchObservedRunningTime="2026-04-16 20:53:29.430589151 +0000 UTC m=+2506.719514936" Apr 16 20:54:00.429520 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:00.429473 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 20:54:10.418284 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:10.418250 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:54:13.795071 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.795039 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k"] Apr 16 20:54:13.795458 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.795287 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerName="kserve-container" containerID="cri-o://87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050" gracePeriod=30 Apr 16 20:54:13.860699 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.860662 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw"] Apr 16 20:54:13.860974 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.860961 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="storage-initializer" Apr 16 20:54:13.861018 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.860975 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="storage-initializer" Apr 16 20:54:13.861018 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.860994 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" Apr 16 20:54:13.861018 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.861000 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" Apr 16 20:54:13.861113 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.861053 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="6609b24f-7c63-4d49-8b9f-bd3cc52154d1" containerName="kserve-container" Apr 16 20:54:13.864323 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.864303 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:54:13.874111 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.874071 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw"] Apr 16 20:54:13.935658 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:13.935625 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a3c0ff-b2fd-4500-988d-a215c2cbeb65-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5bc96b6857-tf9bw\" (UID: \"67a3c0ff-b2fd-4500-988d-a215c2cbeb65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:54:14.036365 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:14.036324 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a3c0ff-b2fd-4500-988d-a215c2cbeb65-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5bc96b6857-tf9bw\" (UID: \"67a3c0ff-b2fd-4500-988d-a215c2cbeb65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:54:14.036699 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:14.036677 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a3c0ff-b2fd-4500-988d-a215c2cbeb65-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-5bc96b6857-tf9bw\" (UID: \"67a3c0ff-b2fd-4500-988d-a215c2cbeb65\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:54:14.175173 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:14.175148 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:54:14.290989 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:14.290958 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw"] Apr 16 20:54:14.294092 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:54:14.294065 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67a3c0ff_b2fd_4500_988d_a215c2cbeb65.slice/crio-5effc0ef5196af49023b1537f48a63e768e484b270636403d299115ead24feaa WatchSource:0}: Error finding container 5effc0ef5196af49023b1537f48a63e768e484b270636403d299115ead24feaa: Status 404 returned error can't find the container with id 5effc0ef5196af49023b1537f48a63e768e484b270636403d299115ead24feaa Apr 16 20:54:14.567728 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:14.567646 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" event={"ID":"67a3c0ff-b2fd-4500-988d-a215c2cbeb65","Type":"ContainerStarted","Data":"466b2faaacf6862fd1f1be5ffd03e38700a16d89628dd3857ac74b56b3c927be"} Apr 16 20:54:14.567728 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:14.567680 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" event={"ID":"67a3c0ff-b2fd-4500-988d-a215c2cbeb65","Type":"ContainerStarted","Data":"5effc0ef5196af49023b1537f48a63e768e484b270636403d299115ead24feaa"} Apr 16 20:54:18.581572 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:18.581522 2563 generic.go:358] "Generic (PLEG): container finished" podID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerID="466b2faaacf6862fd1f1be5ffd03e38700a16d89628dd3857ac74b56b3c927be" exitCode=0 Apr 16 20:54:18.581939 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:18.581601 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" event={"ID":"67a3c0ff-b2fd-4500-988d-a215c2cbeb65","Type":"ContainerDied","Data":"466b2faaacf6862fd1f1be5ffd03e38700a16d89628dd3857ac74b56b3c927be"} Apr 16 20:54:19.586800 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:19.586765 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" event={"ID":"67a3c0ff-b2fd-4500-988d-a215c2cbeb65","Type":"ContainerStarted","Data":"17fd20c617f6449101588c83728f1f95be229d3a28a97dec21f9d9b74796e392"} Apr 16 20:54:19.587180 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:19.587102 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:54:19.588456 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:19.588428 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 20:54:19.605034 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:19.604993 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podStartSLOduration=6.604982059 podStartE2EDuration="6.604982059s" podCreationTimestamp="2026-04-16 20:54:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:54:19.603655106 +0000 UTC m=+2556.892580890" watchObservedRunningTime="2026-04-16 20:54:19.604982059 +0000 UTC m=+2556.893907834" Apr 16 20:54:20.235472 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.235448 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:54:20.285911 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.285853 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd7cd998-0a69-48f8-93a4-276fcc29ce47-kserve-provision-location\") pod \"cd7cd998-0a69-48f8-93a4-276fcc29ce47\" (UID: \"cd7cd998-0a69-48f8-93a4-276fcc29ce47\") " Apr 16 20:54:20.286145 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.286124 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd7cd998-0a69-48f8-93a4-276fcc29ce47-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cd7cd998-0a69-48f8-93a4-276fcc29ce47" (UID: "cd7cd998-0a69-48f8-93a4-276fcc29ce47"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:54:20.386897 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.386858 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cd7cd998-0a69-48f8-93a4-276fcc29ce47-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:54:20.590963 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.590872 2563 generic.go:358] "Generic (PLEG): container finished" podID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerID="87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050" exitCode=0 Apr 16 20:54:20.590963 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.590941 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" Apr 16 20:54:20.591449 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.590958 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" event={"ID":"cd7cd998-0a69-48f8-93a4-276fcc29ce47","Type":"ContainerDied","Data":"87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050"} Apr 16 20:54:20.591449 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.590992 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k" event={"ID":"cd7cd998-0a69-48f8-93a4-276fcc29ce47","Type":"ContainerDied","Data":"210d668461e3222cb73d524e8f4b737406e0164481c90f53675e1b00ce6b8218"} Apr 16 20:54:20.591449 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.591012 2563 scope.go:117] "RemoveContainer" containerID="87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050" Apr 16 20:54:20.591638 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.591605 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 20:54:20.599513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.599498 2563 scope.go:117] "RemoveContainer" containerID="9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b" Apr 16 20:54:20.606205 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.606186 2563 scope.go:117] "RemoveContainer" containerID="87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050" Apr 16 20:54:20.606447 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:54:20.606429 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050\": container with ID starting with 87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050 not found: ID does not exist" containerID="87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050" Apr 16 20:54:20.606513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.606454 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050"} err="failed to get container status \"87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050\": rpc error: code = NotFound desc = could not find container \"87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050\": container with ID starting with 87a49a5d00beda47f4c78be37a58b425185d24deaf2b9ea3c245082b87163050 not found: ID does not exist" Apr 16 20:54:20.606513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.606471 2563 scope.go:117] "RemoveContainer" containerID="9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b" Apr 16 20:54:20.606766 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:54:20.606749 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b\": container with ID starting with 9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b not found: ID does not exist" 
containerID="9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b" Apr 16 20:54:20.606808 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.606769 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b"} err="failed to get container status \"9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b\": rpc error: code = NotFound desc = could not find container \"9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b\": container with ID starting with 9d3d21cf78f43355c48811fe4ae98afe7dea61e640bf2fa0a5c175630390f67b not found: ID does not exist" Apr 16 20:54:20.612710 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.612686 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k"] Apr 16 20:54:20.614639 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:20.614617 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-8w22k"] Apr 16 20:54:21.298291 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:21.298261 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" path="/var/lib/kubelet/pods/cd7cd998-0a69-48f8-93a4-276fcc29ce47/volumes" Apr 16 20:54:30.592458 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:30.592412 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 20:54:40.591812 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:40.591773 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 20:54:47.415017 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:47.414984 2563 scope.go:117] "RemoveContainer" containerID="1a2a87f6bde94561b788e1701fcad265e586a6c96a247e8eaf46768456aca3d7" Apr 16 20:54:47.422477 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:47.422457 2563 scope.go:117] "RemoveContainer" containerID="4135b5d97e0fc04204c3a9a4f8cbaf9aee7e62bbfc992b5939423e775c7865ac" Apr 16 20:54:50.592407 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:54:50.592365 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 20:55:00.592021 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:00.591983 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 20:55:10.591816 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:10.591776 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 20:55:20.592727 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:20.592696 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:55:24.068810 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.068771 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw"] Apr 16 20:55:24.069162 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.069029 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" containerID="cri-o://17fd20c617f6449101588c83728f1f95be229d3a28a97dec21f9d9b74796e392" gracePeriod=30 Apr 16 20:55:24.111597 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.111546 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82"] Apr 16 20:55:24.111872 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.111860 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerName="kserve-container" Apr 16 20:55:24.111914 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.111873 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerName="kserve-container" Apr 16 20:55:24.111914 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.111893 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerName="storage-initializer" Apr 16 20:55:24.111914 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.111899 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerName="storage-initializer" Apr 16 20:55:24.112041 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.111951 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="cd7cd998-0a69-48f8-93a4-276fcc29ce47" containerName="kserve-container" Apr 16 20:55:24.114977 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.114962 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:55:24.122620 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.122598 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82"] Apr 16 20:55:24.160252 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.160223 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/224e3b19-cff9-44e9-9ced-ab562fc11d75-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82\" (UID: \"224e3b19-cff9-44e9-9ced-ab562fc11d75\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:55:24.260739 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.260711 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/224e3b19-cff9-44e9-9ced-ab562fc11d75-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82\" (UID: \"224e3b19-cff9-44e9-9ced-ab562fc11d75\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:55:24.261137 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.261116 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/224e3b19-cff9-44e9-9ced-ab562fc11d75-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82\" (UID: \"224e3b19-cff9-44e9-9ced-ab562fc11d75\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:55:24.427136 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.427089 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:55:24.543860 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.543831 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82"] Apr 16 20:55:24.545949 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:55:24.545910 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod224e3b19_cff9_44e9_9ced_ab562fc11d75.slice/crio-0c521a5347dc8e465a28b40dd93bae98d8528461d278f91acd1186e973477013 WatchSource:0}: Error finding container 0c521a5347dc8e465a28b40dd93bae98d8528461d278f91acd1186e973477013: Status 404 returned error can't find the container with id 0c521a5347dc8e465a28b40dd93bae98d8528461d278f91acd1186e973477013 Apr 16 20:55:24.788805 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.788711 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" event={"ID":"224e3b19-cff9-44e9-9ced-ab562fc11d75","Type":"ContainerStarted","Data":"28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d"} Apr 16 20:55:24.788805 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:24.788755 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" event={"ID":"224e3b19-cff9-44e9-9ced-ab562fc11d75","Type":"ContainerStarted","Data":"0c521a5347dc8e465a28b40dd93bae98d8528461d278f91acd1186e973477013"} Apr 16 20:55:27.799207 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:27.799178 2563 generic.go:358] "Generic (PLEG): container finished" podID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerID="17fd20c617f6449101588c83728f1f95be229d3a28a97dec21f9d9b74796e392" exitCode=0 Apr 16 20:55:27.799513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:27.799244 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" event={"ID":"67a3c0ff-b2fd-4500-988d-a215c2cbeb65","Type":"ContainerDied","Data":"17fd20c617f6449101588c83728f1f95be229d3a28a97dec21f9d9b74796e392"} Apr 16 20:55:27.900735 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:27.900713 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:55:27.985316 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:27.985289 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a3c0ff-b2fd-4500-988d-a215c2cbeb65-kserve-provision-location\") pod \"67a3c0ff-b2fd-4500-988d-a215c2cbeb65\" (UID: \"67a3c0ff-b2fd-4500-988d-a215c2cbeb65\") " Apr 16 20:55:27.985611 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:27.985591 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67a3c0ff-b2fd-4500-988d-a215c2cbeb65-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "67a3c0ff-b2fd-4500-988d-a215c2cbeb65" (UID: "67a3c0ff-b2fd-4500-988d-a215c2cbeb65"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:55:28.086369 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.086313 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/67a3c0ff-b2fd-4500-988d-a215c2cbeb65-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:55:28.804105 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.804061 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" event={"ID":"67a3c0ff-b2fd-4500-988d-a215c2cbeb65","Type":"ContainerDied","Data":"5effc0ef5196af49023b1537f48a63e768e484b270636403d299115ead24feaa"} Apr 16 20:55:28.804502 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.804113 2563 scope.go:117] "RemoveContainer" containerID="17fd20c617f6449101588c83728f1f95be229d3a28a97dec21f9d9b74796e392" Apr 16 20:55:28.804502 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.804119 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw" Apr 16 20:55:28.805487 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.805434 2563 generic.go:358] "Generic (PLEG): container finished" podID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerID="28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d" exitCode=0 Apr 16 20:55:28.805542 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.805481 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" event={"ID":"224e3b19-cff9-44e9-9ced-ab562fc11d75","Type":"ContainerDied","Data":"28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d"} Apr 16 20:55:28.815488 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.815236 2563 scope.go:117] "RemoveContainer" containerID="466b2faaacf6862fd1f1be5ffd03e38700a16d89628dd3857ac74b56b3c927be" Apr 16 20:55:28.847151 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.847128 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw"] Apr 16 20:55:28.852314 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:28.852291 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-5bc96b6857-tf9bw"] Apr 16 20:55:29.298411 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:29.298378 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" path="/var/lib/kubelet/pods/67a3c0ff-b2fd-4500-988d-a215c2cbeb65/volumes" Apr 16 20:55:29.811703 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:29.811667 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" event={"ID":"224e3b19-cff9-44e9-9ced-ab562fc11d75","Type":"ContainerStarted","Data":"f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb"} Apr 16 20:55:29.812098 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:29.811951 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:55:29.813244 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:29.813220 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:55:29.833917 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:29.833878 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podStartSLOduration=5.833866102 podStartE2EDuration="5.833866102s" podCreationTimestamp="2026-04-16 20:55:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:55:29.832617316 +0000 UTC m=+2627.121543102" watchObservedRunningTime="2026-04-16 20:55:29.833866102 +0000 UTC m=+2627.122791887" Apr 16 20:55:30.815440 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:30.815405 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:55:40.815739 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:40.815700 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:55:50.815949 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:55:50.815909 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:56:00.815927 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:00.815882 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:56:10.816066 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:10.816026 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:56:20.815462 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:20.815414 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:56:30.816136 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:30.816096 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 20:56:40.816575 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:40.816478 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:56:44.264650 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.264613 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82"] Apr 16 20:56:44.265105 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.264963 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" containerID="cri-o://f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb" gracePeriod=30 Apr 16 20:56:44.325385 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.325359 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l"] Apr 16 20:56:44.325720 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.325707 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" Apr 16 20:56:44.325771 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.325722 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" Apr 16 20:56:44.325771 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.325742 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="storage-initializer" Apr 16 20:56:44.325771 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.325747 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="storage-initializer" Apr 16 20:56:44.325892 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.325808 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="67a3c0ff-b2fd-4500-988d-a215c2cbeb65" containerName="kserve-container" Apr 16 20:56:44.328759 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.328741 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:56:44.339693 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.339672 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l"] Apr 16 20:56:44.463098 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.463066 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0b4e426-7a87-47b4-bed7-c9b595a7a30c-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-cgz2l\" (UID: \"c0b4e426-7a87-47b4-bed7-c9b595a7a30c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:56:44.564271 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.564192 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0b4e426-7a87-47b4-bed7-c9b595a7a30c-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-cgz2l\" (UID: \"c0b4e426-7a87-47b4-bed7-c9b595a7a30c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:56:44.564572 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.564536 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0b4e426-7a87-47b4-bed7-c9b595a7a30c-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-cgz2l\" (UID: \"c0b4e426-7a87-47b4-bed7-c9b595a7a30c\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:56:44.638909 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.638887 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:56:44.760616 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.760543 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l"] Apr 16 20:56:44.763257 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:56:44.763229 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b4e426_7a87_47b4_bed7_c9b595a7a30c.slice/crio-32f9624b619d8b443e2a523a23c2d7ab45191151f2d650447bcd5787a881ccab WatchSource:0}: Error finding container 32f9624b619d8b443e2a523a23c2d7ab45191151f2d650447bcd5787a881ccab: Status 404 returned error can't find the container with id 32f9624b619d8b443e2a523a23c2d7ab45191151f2d650447bcd5787a881ccab Apr 16 20:56:44.765073 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:44.765055 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 20:56:45.042760 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:45.042724 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" event={"ID":"c0b4e426-7a87-47b4-bed7-c9b595a7a30c","Type":"ContainerStarted","Data":"7d1d52b3936fca8f44b042988f8d2c0faea0ab6822d94f49a234a5f22b00ba9f"} Apr 16 20:56:45.042760 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:45.042764 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" event={"ID":"c0b4e426-7a87-47b4-bed7-c9b595a7a30c","Type":"ContainerStarted","Data":"32f9624b619d8b443e2a523a23c2d7ab45191151f2d650447bcd5787a881ccab"} Apr 16 20:56:48.313442 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:48.313418 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:56:48.391441 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:48.391377 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/224e3b19-cff9-44e9-9ced-ab562fc11d75-kserve-provision-location\") pod \"224e3b19-cff9-44e9-9ced-ab562fc11d75\" (UID: \"224e3b19-cff9-44e9-9ced-ab562fc11d75\") " Apr 16 20:56:48.391691 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:48.391669 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/224e3b19-cff9-44e9-9ced-ab562fc11d75-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "224e3b19-cff9-44e9-9ced-ab562fc11d75" (UID: "224e3b19-cff9-44e9-9ced-ab562fc11d75"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:56:48.492255 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:48.492223 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/224e3b19-cff9-44e9-9ced-ab562fc11d75-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:56:49.057238 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.057203 2563 generic.go:358] "Generic (PLEG): container finished" podID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerID="f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb" exitCode=0 Apr 16 20:56:49.057426 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.057280 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" Apr 16 20:56:49.057426 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.057285 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" event={"ID":"224e3b19-cff9-44e9-9ced-ab562fc11d75","Type":"ContainerDied","Data":"f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb"} Apr 16 20:56:49.057426 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.057328 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82" event={"ID":"224e3b19-cff9-44e9-9ced-ab562fc11d75","Type":"ContainerDied","Data":"0c521a5347dc8e465a28b40dd93bae98d8528461d278f91acd1186e973477013"} Apr 16 20:56:49.057426 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.057348 2563 scope.go:117] "RemoveContainer" containerID="f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb" Apr 16 20:56:49.069743 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.069666 2563 scope.go:117] "RemoveContainer" containerID="28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d" Apr 16 20:56:49.076707 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.076686 2563 scope.go:117] "RemoveContainer" containerID="f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb" Apr 16 20:56:49.076959 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:56:49.076937 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb\": container with ID starting with f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb not found: ID does not exist" containerID="f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb" Apr 16 20:56:49.077007 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.076968 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb"} err="failed to get container status \"f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb\": rpc error: code = NotFound desc = could not find container \"f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb\": container with ID starting with f81a0bf8f12fc794949730e9750af25ed200a2bb5e148d51b14c8a0ba7588dbb not found: ID does not exist" Apr 16 20:56:49.077007 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.076991 2563 scope.go:117] "RemoveContainer" containerID="28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d" Apr 16 20:56:49.077222 ip-10-0-138-118 
kubenswrapper[2563]: E0416 20:56:49.077203 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d\": container with ID starting with 28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d not found: ID does not exist" containerID="28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d" Apr 16 20:56:49.077263 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.077230 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d"} err="failed to get container status \"28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d\": rpc error: code = NotFound desc = could not find container \"28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d\": container with ID starting with 28be2fc509cac689489339341418fd6261270a723cb550816c5d7db5c584e86d not found: ID does not exist" Apr 16 20:56:49.080659 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.080639 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82"] Apr 16 20:56:49.083735 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.083715 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-595976f8b-2pg82"] Apr 16 20:56:49.298346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:49.298314 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" path="/var/lib/kubelet/pods/224e3b19-cff9-44e9-9ced-ab562fc11d75/volumes" Apr 16 20:56:50.062853 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:50.062820 2563 generic.go:358] "Generic (PLEG): container finished" podID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerID="7d1d52b3936fca8f44b042988f8d2c0faea0ab6822d94f49a234a5f22b00ba9f" exitCode=0 Apr 16 20:56:50.063215 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:50.062871 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" event={"ID":"c0b4e426-7a87-47b4-bed7-c9b595a7a30c","Type":"ContainerDied","Data":"7d1d52b3936fca8f44b042988f8d2c0faea0ab6822d94f49a234a5f22b00ba9f"} Apr 16 20:56:54.079635 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:54.079601 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" event={"ID":"c0b4e426-7a87-47b4-bed7-c9b595a7a30c","Type":"ContainerStarted","Data":"8038cdf160ca79c4508ec6383e8c5045f403f533c3e29337a5f35861de379cdd"} Apr 16 20:56:54.080014 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:54.079878 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:56:54.081324 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:54.081297 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 20:56:54.097503 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:54.097461 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" 
podStartSLOduration=6.52258554 podStartE2EDuration="10.09745076s" podCreationTimestamp="2026-04-16 20:56:44 +0000 UTC" firstStartedPulling="2026-04-16 20:56:50.063938205 +0000 UTC m=+2707.352863967" lastFinishedPulling="2026-04-16 20:56:53.63880341 +0000 UTC m=+2710.927729187" observedRunningTime="2026-04-16 20:56:54.095925781 +0000 UTC m=+2711.384851557" watchObservedRunningTime="2026-04-16 20:56:54.09745076 +0000 UTC m=+2711.386376545" Apr 16 20:56:55.083814 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:56:55.083770 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 20:57:05.084652 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:05.084619 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:57:23.954972 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:23.954938 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l"] Apr 16 20:57:23.955454 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:23.955198 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="kserve-container" containerID="cri-o://8038cdf160ca79c4508ec6383e8c5045f403f533c3e29337a5f35861de379cdd" gracePeriod=30 Apr 16 20:57:24.048237 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.048204 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx"] Apr 16 20:57:24.048693 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.048668 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" Apr 16 20:57:24.048693 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.048687 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" Apr 16 20:57:24.048851 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.048705 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="storage-initializer" Apr 16 20:57:24.048851 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.048714 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="storage-initializer" Apr 16 20:57:24.048851 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.048831 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="224e3b19-cff9-44e9-9ced-ab562fc11d75" containerName="kserve-container" Apr 16 20:57:24.051630 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.051615 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:57:24.059958 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.059935 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx"] Apr 16 20:57:24.169427 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.169398 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab6cc60-414c-40d4-b3e8-5e00f851dbf3-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx\" (UID: \"eab6cc60-414c-40d4-b3e8-5e00f851dbf3\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:57:24.270606 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.270538 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab6cc60-414c-40d4-b3e8-5e00f851dbf3-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx\" (UID: \"eab6cc60-414c-40d4-b3e8-5e00f851dbf3\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:57:24.270955 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.270934 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab6cc60-414c-40d4-b3e8-5e00f851dbf3-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx\" (UID: \"eab6cc60-414c-40d4-b3e8-5e00f851dbf3\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:57:24.362788 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.362761 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:57:24.480878 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:24.480848 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx"] Apr 16 20:57:24.483952 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:57:24.483923 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeab6cc60_414c_40d4_b3e8_5e00f851dbf3.slice/crio-d43bcad71bad087d40f5dfb9c32dafa9f4c4359a209ee82f2a68e479f87ee32e WatchSource:0}: Error finding container d43bcad71bad087d40f5dfb9c32dafa9f4c4359a209ee82f2a68e479f87ee32e: Status 404 returned error can't find the container with id d43bcad71bad087d40f5dfb9c32dafa9f4c4359a209ee82f2a68e479f87ee32e Apr 16 20:57:25.187463 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:25.187420 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" event={"ID":"eab6cc60-414c-40d4-b3e8-5e00f851dbf3","Type":"ContainerStarted","Data":"bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a"} Apr 16 20:57:25.187463 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:25.187463 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" event={"ID":"eab6cc60-414c-40d4-b3e8-5e00f851dbf3","Type":"ContainerStarted","Data":"d43bcad71bad087d40f5dfb9c32dafa9f4c4359a209ee82f2a68e479f87ee32e"} Apr 16 20:57:29.201307 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:29.201228 2563 generic.go:358] "Generic (PLEG): container finished" podID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerID="bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a" exitCode=0 Apr 16 20:57:29.201668 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:29.201301 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" event={"ID":"eab6cc60-414c-40d4-b3e8-5e00f851dbf3","Type":"ContainerDied","Data":"bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a"} Apr 16 20:57:30.206236 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:30.206201 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" event={"ID":"eab6cc60-414c-40d4-b3e8-5e00f851dbf3","Type":"ContainerStarted","Data":"8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc"} Apr 16 20:57:30.206690 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:30.206494 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:57:30.207522 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:30.207495 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.59:8080: connect: connection refused" Apr 16 20:57:30.223882 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:30.223841 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" podStartSLOduration=6.223831158 podStartE2EDuration="6.223831158s" podCreationTimestamp="2026-04-16 
20:57:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 20:57:30.222270049 +0000 UTC m=+2747.511195836" watchObservedRunningTime="2026-04-16 20:57:30.223831158 +0000 UTC m=+2747.512756943" Apr 16 20:57:31.210147 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:31.210109 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.59:8080: connect: connection refused" Apr 16 20:57:41.211438 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:41.211412 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:57:53.558814 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.558779 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx"] Apr 16 20:57:53.559164 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.559054 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="kserve-container" containerID="cri-o://8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc" gracePeriod=30 Apr 16 20:57:53.630189 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.630159 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh"] Apr 16 20:57:53.632661 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.632645 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 20:57:53.641110 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.641088 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh"] Apr 16 20:57:53.699933 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.699906 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19defbd-c110-456f-8448-412d9f791bce-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-jjcgh\" (UID: \"b19defbd-c110-456f-8448-412d9f791bce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 20:57:53.800503 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.800472 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19defbd-c110-456f-8448-412d9f791bce-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-jjcgh\" (UID: \"b19defbd-c110-456f-8448-412d9f791bce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 20:57:53.800849 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.800829 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19defbd-c110-456f-8448-412d9f791bce-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-jjcgh\" (UID: \"b19defbd-c110-456f-8448-412d9f791bce\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 20:57:53.944700 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:53.944676 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 20:57:54.078551 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.078450 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh"] Apr 16 20:57:54.081795 ip-10-0-138-118 kubenswrapper[2563]: W0416 20:57:54.081763 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb19defbd_c110_456f_8448_412d9f791bce.slice/crio-f5c41beb0a5febec4abdcb0b4c23ec40b8846f7137473ec329074d0459001939 WatchSource:0}: Error finding container f5c41beb0a5febec4abdcb0b4c23ec40b8846f7137473ec329074d0459001939: Status 404 returned error can't find the container with id f5c41beb0a5febec4abdcb0b4c23ec40b8846f7137473ec329074d0459001939 Apr 16 20:57:54.283876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.283785 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" event={"ID":"b19defbd-c110-456f-8448-412d9f791bce","Type":"ContainerStarted","Data":"7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54"} Apr 16 20:57:54.283876 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.283831 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" event={"ID":"b19defbd-c110-456f-8448-412d9f791bce","Type":"ContainerStarted","Data":"f5c41beb0a5febec4abdcb0b4c23ec40b8846f7137473ec329074d0459001939"} Apr 16 20:57:54.285404 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.285377 2563 generic.go:358] "Generic (PLEG): container finished" podID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerID="8038cdf160ca79c4508ec6383e8c5045f403f533c3e29337a5f35861de379cdd" exitCode=137 Apr 16 20:57:54.285513 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.285410 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" event={"ID":"c0b4e426-7a87-47b4-bed7-c9b595a7a30c","Type":"ContainerDied","Data":"8038cdf160ca79c4508ec6383e8c5045f403f533c3e29337a5f35861de379cdd"} Apr 16 20:57:54.575825 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.575802 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:57:54.605681 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.605650 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0b4e426-7a87-47b4-bed7-c9b595a7a30c-kserve-provision-location\") pod \"c0b4e426-7a87-47b4-bed7-c9b595a7a30c\" (UID: \"c0b4e426-7a87-47b4-bed7-c9b595a7a30c\") " Apr 16 20:57:54.619617 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.619589 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0b4e426-7a87-47b4-bed7-c9b595a7a30c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c0b4e426-7a87-47b4-bed7-c9b595a7a30c" (UID: "c0b4e426-7a87-47b4-bed7-c9b595a7a30c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:57:54.706708 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:54.706677 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c0b4e426-7a87-47b4-bed7-c9b595a7a30c-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:57:55.290608 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:55.290521 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" Apr 16 20:57:55.290608 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:55.290533 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l" event={"ID":"c0b4e426-7a87-47b4-bed7-c9b595a7a30c","Type":"ContainerDied","Data":"32f9624b619d8b443e2a523a23c2d7ab45191151f2d650447bcd5787a881ccab"} Apr 16 20:57:55.290797 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:55.290611 2563 scope.go:117] "RemoveContainer" containerID="8038cdf160ca79c4508ec6383e8c5045f403f533c3e29337a5f35861de379cdd" Apr 16 20:57:55.299361 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:55.299224 2563 scope.go:117] "RemoveContainer" containerID="7d1d52b3936fca8f44b042988f8d2c0faea0ab6822d94f49a234a5f22b00ba9f" Apr 16 20:57:55.313420 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:55.313399 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l"] Apr 16 20:57:55.316378 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:55.316360 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-cgz2l"] Apr 16 20:57:57.298735 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:57.298701 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" path="/var/lib/kubelet/pods/c0b4e426-7a87-47b4-bed7-c9b595a7a30c/volumes" Apr 16 20:57:58.304223 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:58.304189 2563 generic.go:358] "Generic (PLEG): container finished" podID="b19defbd-c110-456f-8448-412d9f791bce" containerID="7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54" exitCode=0 Apr 16 20:57:58.304591 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:57:58.304264 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" event={"ID":"b19defbd-c110-456f-8448-412d9f791bce","Type":"ContainerDied","Data":"7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54"} Apr 16 20:58:24.257584 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.257500 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:58:24.376490 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.376299 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab6cc60-414c-40d4-b3e8-5e00f851dbf3-kserve-provision-location\") pod \"eab6cc60-414c-40d4-b3e8-5e00f851dbf3\" (UID: \"eab6cc60-414c-40d4-b3e8-5e00f851dbf3\") " Apr 16 20:58:24.386920 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.386890 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eab6cc60-414c-40d4-b3e8-5e00f851dbf3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eab6cc60-414c-40d4-b3e8-5e00f851dbf3" (UID: "eab6cc60-414c-40d4-b3e8-5e00f851dbf3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 20:58:24.418788 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.418677 2563 generic.go:358] "Generic (PLEG): container finished" podID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerID="8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc" exitCode=137 Apr 16 20:58:24.418788 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.418776 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" Apr 16 20:58:24.418980 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.418810 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" event={"ID":"eab6cc60-414c-40d4-b3e8-5e00f851dbf3","Type":"ContainerDied","Data":"8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc"} Apr 16 20:58:24.418980 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.418836 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx" event={"ID":"eab6cc60-414c-40d4-b3e8-5e00f851dbf3","Type":"ContainerDied","Data":"d43bcad71bad087d40f5dfb9c32dafa9f4c4359a209ee82f2a68e479f87ee32e"} Apr 16 20:58:24.418980 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.418857 2563 scope.go:117] "RemoveContainer" containerID="8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc" Apr 16 20:58:24.429314 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.429292 2563 scope.go:117] "RemoveContainer" containerID="bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a" Apr 16 20:58:24.438764 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.438728 2563 scope.go:117] "RemoveContainer" containerID="8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc" Apr 16 20:58:24.439059 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:58:24.439024 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc\": container with ID starting with 8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc not found: ID does not exist" containerID="8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc" Apr 16 20:58:24.439184 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.439058 2563 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc"} err="failed to get container status \"8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc\": rpc error: code = NotFound desc = could not find container \"8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc\": container with ID starting with 8d1a3d374e6d7b9aa05f1e967ad7daccc63691b7b9282965600d509ba3fd73bc not found: ID does not exist" Apr 16 20:58:24.439184 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.439083 2563 scope.go:117] "RemoveContainer" containerID="bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a" Apr 16 20:58:24.439350 ip-10-0-138-118 kubenswrapper[2563]: E0416 20:58:24.439324 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a\": container with ID starting with bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a not found: ID does not exist" containerID="bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a" Apr 16 20:58:24.439471 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.439358 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a"} err="failed to get container status \"bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a\": rpc error: code = NotFound desc = could not find container \"bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a\": container with ID starting with bea75406a350a1995a77f0078edfe5f99d9ac8a7e85851182ea2b7cb0edad32a not found: ID does not exist" Apr 16 20:58:24.444240 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.444217 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx"] Apr 16 20:58:24.447632 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.447609 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-zcnqx"] Apr 16 20:58:24.477346 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:24.477325 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eab6cc60-414c-40d4-b3e8-5e00f851dbf3-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 20:58:25.300778 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:58:25.300742 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" path="/var/lib/kubelet/pods/eab6cc60-414c-40d4-b3e8-5e00f851dbf3/volumes" Apr 16 20:59:52.723500 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:59:52.723459 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" event={"ID":"b19defbd-c110-456f-8448-412d9f791bce","Type":"ContainerStarted","Data":"82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff"} Apr 16 20:59:52.723986 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:59:52.723610 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 20:59:52.724887 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:59:52.724864 2563 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 16 20:59:52.740635 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:59:52.740541 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" podStartSLOduration=5.8943160169999995 podStartE2EDuration="1m59.740528277s" podCreationTimestamp="2026-04-16 20:57:53 +0000 UTC" firstStartedPulling="2026-04-16 20:57:58.305360419 +0000 UTC m=+2775.594286182" lastFinishedPulling="2026-04-16 20:59:52.151572674 +0000 UTC m=+2889.440498442" observedRunningTime="2026-04-16 20:59:52.739863539 +0000 UTC m=+2890.028789324" watchObservedRunningTime="2026-04-16 20:59:52.740528277 +0000 UTC m=+2890.029454067" Apr 16 20:59:53.727218 ip-10-0-138-118 kubenswrapper[2563]: I0416 20:59:53.727182 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.60:8080: connect: connection refused" Apr 16 21:00:03.728874 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:03.728847 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 21:00:15.195083 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.195052 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh"] Apr 16 21:00:15.195515 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.195300 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="kserve-container" containerID="cri-o://82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff" gracePeriod=30 Apr 16 21:00:15.385145 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385115 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"] Apr 16 21:00:15.385429 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385417 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="storage-initializer" Apr 16 21:00:15.385488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385430 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="storage-initializer" Apr 16 21:00:15.385488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385441 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="kserve-container" Apr 16 21:00:15.385488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385448 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="kserve-container" Apr 16 21:00:15.385488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385455 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="kserve-container" Apr 16 21:00:15.385488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385461 2563 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="kserve-container" Apr 16 21:00:15.385488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385470 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="storage-initializer" Apr 16 21:00:15.385488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385476 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="storage-initializer" Apr 16 21:00:15.385715 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385526 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0b4e426-7a87-47b4-bed7-c9b595a7a30c" containerName="kserve-container" Apr 16 21:00:15.385715 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.385537 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="eab6cc60-414c-40d4-b3e8-5e00f851dbf3" containerName="kserve-container" Apr 16 21:00:15.389487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.389468 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" Apr 16 21:00:15.395693 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.395669 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"] Apr 16 21:00:15.512534 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.512455 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb118591-76e0-4ea1-8add-14cb4746493f-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-2gxjl\" (UID: \"cb118591-76e0-4ea1-8add-14cb4746493f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" Apr 16 21:00:15.613134 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.613104 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb118591-76e0-4ea1-8add-14cb4746493f-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-2gxjl\" (UID: \"cb118591-76e0-4ea1-8add-14cb4746493f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" Apr 16 21:00:15.613445 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.613428 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb118591-76e0-4ea1-8add-14cb4746493f-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-2gxjl\" (UID: \"cb118591-76e0-4ea1-8add-14cb4746493f\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" Apr 16 21:00:15.700659 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.700629 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" Apr 16 21:00:15.820967 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:15.820941 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"] Apr 16 21:00:15.823506 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:00:15.823478 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb118591_76e0_4ea1_8add_14cb4746493f.slice/crio-8b78dfa6bc0f4e3b3ed5bded168cafd023b1be0881a486e2460d4bb309180fbd WatchSource:0}: Error finding container 8b78dfa6bc0f4e3b3ed5bded168cafd023b1be0881a486e2460d4bb309180fbd: Status 404 returned error can't find the container with id 8b78dfa6bc0f4e3b3ed5bded168cafd023b1be0881a486e2460d4bb309180fbd Apr 16 21:00:16.794735 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:16.794698 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" event={"ID":"cb118591-76e0-4ea1-8add-14cb4746493f","Type":"ContainerStarted","Data":"4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83"} Apr 16 21:00:16.794735 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:16.794739 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" event={"ID":"cb118591-76e0-4ea1-8add-14cb4746493f","Type":"ContainerStarted","Data":"8b78dfa6bc0f4e3b3ed5bded168cafd023b1be0881a486e2460d4bb309180fbd"} Apr 16 21:00:17.733936 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.733911 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 21:00:17.799948 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.799878 2563 generic.go:358] "Generic (PLEG): container finished" podID="b19defbd-c110-456f-8448-412d9f791bce" containerID="82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff" exitCode=0 Apr 16 21:00:17.800274 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.799949 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" Apr 16 21:00:17.800274 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.799966 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" event={"ID":"b19defbd-c110-456f-8448-412d9f791bce","Type":"ContainerDied","Data":"82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff"} Apr 16 21:00:17.800274 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.800006 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh" event={"ID":"b19defbd-c110-456f-8448-412d9f791bce","Type":"ContainerDied","Data":"f5c41beb0a5febec4abdcb0b4c23ec40b8846f7137473ec329074d0459001939"} Apr 16 21:00:17.800274 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.800021 2563 scope.go:117] "RemoveContainer" containerID="82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff" Apr 16 21:00:17.807390 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.807373 2563 scope.go:117] "RemoveContainer" containerID="7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54" Apr 16 21:00:17.813987 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.813973 2563 scope.go:117] "RemoveContainer" containerID="82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff" Apr 16 21:00:17.814223 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:00:17.814206 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff\": container with ID starting with 82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff not found: ID does not exist" containerID="82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff" Apr 16 21:00:17.814279 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.814233 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff"} err="failed to get container status \"82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff\": rpc error: code = NotFound desc = could not find container \"82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff\": container with ID starting with 82d459eda15a34c1ad0fb1f490e29fcf962ef47752b2d474cb14d6ecfe4c9cff not found: ID does not exist" Apr 16 21:00:17.814279 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.814250 2563 scope.go:117] "RemoveContainer" containerID="7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54" Apr 16 21:00:17.814496 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:00:17.814479 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54\": container with ID starting with 7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54 not found: ID does not exist" containerID="7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54" Apr 16 21:00:17.814550 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.814506 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54"} err="failed to get container status \"7c099154c8acd3ecf4381f47efe930503abdfb8369fd34a08b5482ae0b324a54\": rpc error: code = NotFound desc = 
Apr 16 21:00:17.831820 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.831801 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19defbd-c110-456f-8448-412d9f791bce-kserve-provision-location\") pod \"b19defbd-c110-456f-8448-412d9f791bce\" (UID: \"b19defbd-c110-456f-8448-412d9f791bce\") "
Apr 16 21:00:17.832118 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.832100 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19defbd-c110-456f-8448-412d9f791bce-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b19defbd-c110-456f-8448-412d9f791bce" (UID: "b19defbd-c110-456f-8448-412d9f791bce"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 21:00:17.932373 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:17.932343 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b19defbd-c110-456f-8448-412d9f791bce-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 21:00:18.121178 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:18.121150 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh"]
Apr 16 21:00:18.124512 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:18.124489 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-jjcgh"]
Apr 16 21:00:19.298911 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:19.298879 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19defbd-c110-456f-8448-412d9f791bce" path="/var/lib/kubelet/pods/b19defbd-c110-456f-8448-412d9f791bce/volumes"
Apr 16 21:00:19.808601 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:19.808549 2563 generic.go:358] "Generic (PLEG): container finished" podID="cb118591-76e0-4ea1-8add-14cb4746493f" containerID="4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83" exitCode=0
Apr 16 21:00:19.808601 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:19.808595 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" event={"ID":"cb118591-76e0-4ea1-8add-14cb4746493f","Type":"ContainerDied","Data":"4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83"}
Apr 16 21:00:39.883036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:39.883000 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" event={"ID":"cb118591-76e0-4ea1-8add-14cb4746493f","Type":"ContainerStarted","Data":"5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781"}
Apr 16 21:00:39.883450 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:39.883293 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"
Apr 16 21:00:39.884430 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:39.884403 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused"
Apr 16 21:00:39.900119 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:39.900078 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podStartSLOduration=5.736198936 podStartE2EDuration="24.900066673s" podCreationTimestamp="2026-04-16 21:00:15 +0000 UTC" firstStartedPulling="2026-04-16 21:00:19.809798402 +0000 UTC m=+2917.098724165" lastFinishedPulling="2026-04-16 21:00:38.973666121 +0000 UTC m=+2936.262591902" observedRunningTime="2026-04-16 21:00:39.897900543 +0000 UTC m=+2937.186826328" watchObservedRunningTime="2026-04-16 21:00:39.900066673 +0000 UTC m=+2937.188992520"
Apr 16 21:00:40.886615 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:40.886578 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused"
Apr 16 21:00:50.887144 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:00:50.887101 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused"
Apr 16 21:01:00.886735 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:00.886694 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused"
Apr 16 21:01:10.887052 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:10.886967 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused"
Apr 16 21:01:20.887378 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:20.887339 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused"
Apr 16 21:01:30.887529 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:30.887487 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused"
Apr 16 21:01:40.887626 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:40.887598 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"
Apr 16 21:01:45.530705 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.530670 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"]
Apr 16 21:01:45.531121 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.530913 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container" containerID="cri-o://5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781" gracePeriod=30
Apr 16 21:01:45.670996 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.670957 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"]
Apr 16 21:01:45.671435 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.671421 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="storage-initializer"
Apr 16 21:01:45.671480 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.671439 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="storage-initializer"
Apr 16 21:01:45.671480 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.671451 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="kserve-container"
Apr 16 21:01:45.671480 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.671460 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="kserve-container"
Apr 16 21:01:45.671587 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.671551 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="b19defbd-c110-456f-8448-412d9f791bce" containerName="kserve-container"
Apr 16 21:01:45.674973 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.674948 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"
Apr 16 21:01:45.690586 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.690533 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"]
Apr 16 21:01:45.728135 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.728103 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94\" (UID: \"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"
Apr 16 21:01:45.831787 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.831698 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94\" (UID: \"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"
Apr 16 21:01:45.832054 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.832029 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94\" (UID: \"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"
Apr 16 21:01:45.985108 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:45.985077 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" Apr 16 21:01:46.140807 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:46.140778 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"] Apr 16 21:01:46.142717 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:01:46.142689 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d4be8b7_26e4_4c95_9d57_9cbc5c57e4fc.slice/crio-0e3fa23c85d9e68632d81c954fc09eb805b97704f0f225054717ef9dc8721f0d WatchSource:0}: Error finding container 0e3fa23c85d9e68632d81c954fc09eb805b97704f0f225054717ef9dc8721f0d: Status 404 returned error can't find the container with id 0e3fa23c85d9e68632d81c954fc09eb805b97704f0f225054717ef9dc8721f0d Apr 16 21:01:46.144781 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:46.144764 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 21:01:47.093423 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:47.093384 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" event={"ID":"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc","Type":"ContainerStarted","Data":"cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc"} Apr 16 21:01:47.093423 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:47.093420 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" event={"ID":"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc","Type":"ContainerStarted","Data":"0e3fa23c85d9e68632d81c954fc09eb805b97704f0f225054717ef9dc8721f0d"} Apr 16 21:01:48.876337 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:48.876307 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" Apr 16 21:01:48.959204 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:48.959171 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb118591-76e0-4ea1-8add-14cb4746493f-kserve-provision-location\") pod \"cb118591-76e0-4ea1-8add-14cb4746493f\" (UID: \"cb118591-76e0-4ea1-8add-14cb4746493f\") " Apr 16 21:01:48.959474 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:48.959451 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cb118591-76e0-4ea1-8add-14cb4746493f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cb118591-76e0-4ea1-8add-14cb4746493f" (UID: "cb118591-76e0-4ea1-8add-14cb4746493f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:01:49.060063 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.059992 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cb118591-76e0-4ea1-8add-14cb4746493f-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:01:49.102000 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.101966 2563 generic.go:358] "Generic (PLEG): container finished" podID="cb118591-76e0-4ea1-8add-14cb4746493f" containerID="5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781" exitCode=0 Apr 16 21:01:49.102135 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.102040 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" Apr 16 21:01:49.102135 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.102052 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" event={"ID":"cb118591-76e0-4ea1-8add-14cb4746493f","Type":"ContainerDied","Data":"5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781"} Apr 16 21:01:49.102135 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.102091 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl" event={"ID":"cb118591-76e0-4ea1-8add-14cb4746493f","Type":"ContainerDied","Data":"8b78dfa6bc0f4e3b3ed5bded168cafd023b1be0881a486e2460d4bb309180fbd"} Apr 16 21:01:49.102135 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.102112 2563 scope.go:117] "RemoveContainer" containerID="5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781" Apr 16 21:01:49.110111 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.110086 2563 scope.go:117] "RemoveContainer" containerID="4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83" Apr 16 21:01:49.116752 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.116733 2563 scope.go:117] "RemoveContainer" containerID="5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781" Apr 16 21:01:49.117022 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:01:49.117003 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781\": container with ID starting with 5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781 not found: ID does not exist" containerID="5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781" Apr 16 21:01:49.117114 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.117037 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781"} err="failed to get container status \"5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781\": rpc error: code = NotFound desc = could not find container \"5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781\": container with ID starting with 5a853265b3fbf5bc2dce08f4098e914665445185bd1c7c663686c3b6a6f78781 not found: ID does not exist" Apr 16 21:01:49.117114 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.117060 2563 scope.go:117] "RemoveContainer" containerID="4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83" Apr 16 21:01:49.117317 ip-10-0-138-118 kubenswrapper[2563]: E0416 
Apr 16 21:01:49.117360 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.117322 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83"} err="failed to get container status \"4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83\": rpc error: code = NotFound desc = could not find container \"4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83\": container with ID starting with 4337bfb76c2d9d035d74e3aa93176c9bd7f25854512754650b3926b11f50dd83 not found: ID does not exist"
Apr 16 21:01:49.130298 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.130273 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"]
Apr 16 21:01:49.136859 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.136836 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-2gxjl"]
Apr 16 21:01:49.298396 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:49.298362 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" path="/var/lib/kubelet/pods/cb118591-76e0-4ea1-8add-14cb4746493f/volumes"
Apr 16 21:01:50.107066 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:50.107034 2563 generic.go:358] "Generic (PLEG): container finished" podID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerID="cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc" exitCode=0
Apr 16 21:01:50.107443 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:50.107115 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" event={"ID":"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc","Type":"ContainerDied","Data":"cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc"}
Apr 16 21:01:51.112735 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:51.112699 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" event={"ID":"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc","Type":"ContainerStarted","Data":"c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1"}
Apr 16 21:01:51.113136 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:01:51.112920 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"
Apr 16 21:02:22.128894 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:22.128865 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"
Apr 16 21:02:22.152041 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:22.151992 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" podStartSLOduration=37.151980103 podStartE2EDuration="37.151980103s" podCreationTimestamp="2026-04-16 21:01:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:01:51.155685605 +0000 UTC m=+3008.444611389" watchObservedRunningTime="2026-04-16 21:02:22.151980103 +0000 UTC m=+3039.440905888"
Apr 16 21:02:25.697158 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.697124 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"]
Apr 16 21:02:25.697518 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.697389 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" podUID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerName="kserve-container" containerID="cri-o://c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1" gracePeriod=30
Apr 16 21:02:25.757483 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.757450 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"]
Apr 16 21:02:25.757787 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.757775 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container"
Apr 16 21:02:25.757833 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.757791 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container"
Apr 16 21:02:25.757833 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.757826 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="storage-initializer"
Apr 16 21:02:25.757833 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.757833 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="storage-initializer"
Apr 16 21:02:25.757936 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.757886 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb118591-76e0-4ea1-8add-14cb4746493f" containerName="kserve-container"
Apr 16 21:02:25.760841 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.760824 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:02:25.772768 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.772744 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"] Apr 16 21:02:25.938579 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:25.938526 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-glfxp\" (UID: \"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:02:26.039933 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:26.039857 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-glfxp\" (UID: \"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:02:26.040207 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:26.040189 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-glfxp\" (UID: \"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:02:26.071007 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:26.070983 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:02:26.189018 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:26.188947 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"] Apr 16 21:02:26.191196 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:02:26.191170 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ba00fc_4a19_4e15_9c3b_a3086ec75e5c.slice/crio-16a85b4f849ceed01692037e81b9a9d64aa48b6ddf8f7b114c22dcdc41abbe6b WatchSource:0}: Error finding container 16a85b4f849ceed01692037e81b9a9d64aa48b6ddf8f7b114c22dcdc41abbe6b: Status 404 returned error can't find the container with id 16a85b4f849ceed01692037e81b9a9d64aa48b6ddf8f7b114c22dcdc41abbe6b Apr 16 21:02:26.227180 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:26.227151 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" event={"ID":"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c","Type":"ContainerStarted","Data":"16a85b4f849ceed01692037e81b9a9d64aa48b6ddf8f7b114c22dcdc41abbe6b"} Apr 16 21:02:27.232183 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:27.232145 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" event={"ID":"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c","Type":"ContainerStarted","Data":"4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5"} Apr 16 21:02:30.243194 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:30.243162 2563 generic.go:358] "Generic (PLEG): container finished" podID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerID="4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5" exitCode=0 Apr 16 21:02:30.243590 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:30.243235 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" event={"ID":"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c","Type":"ContainerDied","Data":"4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5"} Apr 16 21:02:31.247831 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:31.247796 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" event={"ID":"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c","Type":"ContainerStarted","Data":"cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352"} Apr 16 21:02:31.248243 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:31.248017 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:02:31.266356 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:31.266313 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" podStartSLOduration=6.266298864 podStartE2EDuration="6.266298864s" podCreationTimestamp="2026-04-16 21:02:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:02:31.265821257 +0000 UTC m=+3048.554747043" watchObservedRunningTime="2026-04-16 21:02:31.266298864 +0000 UTC m=+3048.555224652" Apr 16 21:02:31.736288 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:31.736264 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" Apr 16 21:02:31.787882 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:31.787822 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc-kserve-provision-location\") pod \"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc\" (UID: \"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc\") " Apr 16 21:02:31.788147 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:31.788122 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" (UID: "4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:02:31.888923 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:31.888895 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:02:32.252075 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.252040 2563 generic.go:358] "Generic (PLEG): container finished" podID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerID="c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1" exitCode=0 Apr 16 21:02:32.252522 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.252107 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" Apr 16 21:02:32.252522 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.252122 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" event={"ID":"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc","Type":"ContainerDied","Data":"c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1"} Apr 16 21:02:32.252522 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.252156 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94" event={"ID":"4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc","Type":"ContainerDied","Data":"0e3fa23c85d9e68632d81c954fc09eb805b97704f0f225054717ef9dc8721f0d"} Apr 16 21:02:32.252522 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.252171 2563 scope.go:117] "RemoveContainer" containerID="c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1" Apr 16 21:02:32.260058 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.260043 2563 scope.go:117] "RemoveContainer" containerID="cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc" Apr 16 21:02:32.266909 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.266892 2563 scope.go:117] "RemoveContainer" containerID="c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1" Apr 16 21:02:32.267143 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:02:32.267124 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1\": container with ID starting with c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1 not found: ID does 
not exist" containerID="c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1" Apr 16 21:02:32.267193 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.267151 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1"} err="failed to get container status \"c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1\": rpc error: code = NotFound desc = could not find container \"c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1\": container with ID starting with c8b7c629da85d99da10eea18d6f6c24da5835b7f6ff67707c34e3980fba096e1 not found: ID does not exist" Apr 16 21:02:32.267193 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.267168 2563 scope.go:117] "RemoveContainer" containerID="cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc" Apr 16 21:02:32.267390 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:02:32.267373 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc\": container with ID starting with cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc not found: ID does not exist" containerID="cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc" Apr 16 21:02:32.267433 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.267396 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc"} err="failed to get container status \"cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc\": rpc error: code = NotFound desc = could not find container \"cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc\": container with ID starting with cd3c3e416b72e216115e4e0a2e9f25b72ed0254c57ac7d5ebd224ecf2367e5dc not found: ID does not exist" Apr 16 21:02:32.274770 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.274749 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"] Apr 16 21:02:32.279918 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:32.279899 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-ntp94"] Apr 16 21:02:33.298596 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:02:33.298546 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" path="/var/lib/kubelet/pods/4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc/volumes" Apr 16 21:03:02.329197 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:02.329160 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:03:05.838664 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.838632 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"] Apr 16 21:03:05.839104 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.838879 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" podUID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerName="kserve-container" containerID="cri-o://cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352" gracePeriod=30 Apr 16 
Apr 16 21:03:05.977950 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.977918 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"]
Apr 16 21:03:05.978228 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.978215 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerName="kserve-container"
Apr 16 21:03:05.978291 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.978229 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerName="kserve-container"
Apr 16 21:03:05.978291 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.978253 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerName="storage-initializer"
Apr 16 21:03:05.978291 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.978258 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerName="storage-initializer"
Apr 16 21:03:05.978455 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.978306 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="4d4be8b7-26e4-4c95-9d57-9cbc5c57e4fc" containerName="kserve-container"
Apr 16 21:03:05.981053 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.981035 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"
Apr 16 21:03:05.993756 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:05.993729 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"]
Apr 16 21:03:06.040723 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:06.040692 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd6ed8b0-01d0-4602-8a0a-f799772e3116-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-k5pwb\" (UID: \"bd6ed8b0-01d0-4602-8a0a-f799772e3116\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"
Apr 16 21:03:06.141605 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:06.141512 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd6ed8b0-01d0-4602-8a0a-f799772e3116-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-k5pwb\" (UID: \"bd6ed8b0-01d0-4602-8a0a-f799772e3116\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"
Apr 16 21:03:06.141869 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:06.141850 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd6ed8b0-01d0-4602-8a0a-f799772e3116-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-k5pwb\" (UID: \"bd6ed8b0-01d0-4602-8a0a-f799772e3116\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"
Apr 16 21:03:06.295567 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:06.295530 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"
Apr 16 21:03:06.415595 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:06.415508 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"]
Apr 16 21:03:06.419111 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:03:06.419085 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd6ed8b0_01d0_4602_8a0a_f799772e3116.slice/crio-3c0f6ecf9ebbcbd4ddf6383ea4d8a75fc3eabf45d31f3509b895bfb7b31381fe WatchSource:0}: Error finding container 3c0f6ecf9ebbcbd4ddf6383ea4d8a75fc3eabf45d31f3509b895bfb7b31381fe: Status 404 returned error can't find the container with id 3c0f6ecf9ebbcbd4ddf6383ea4d8a75fc3eabf45d31f3509b895bfb7b31381fe
Apr 16 21:03:07.361527 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:07.361492 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" event={"ID":"bd6ed8b0-01d0-4602-8a0a-f799772e3116","Type":"ContainerStarted","Data":"1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a"}
Apr 16 21:03:07.361527 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:07.361526 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" event={"ID":"bd6ed8b0-01d0-4602-8a0a-f799772e3116","Type":"ContainerStarted","Data":"3c0f6ecf9ebbcbd4ddf6383ea4d8a75fc3eabf45d31f3509b895bfb7b31381fe"}
Apr 16 21:03:11.268075 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.268052 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"
Apr 16 21:03:11.374401 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.374371 2563 generic.go:358] "Generic (PLEG): container finished" podID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerID="1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a" exitCode=0
Apr 16 21:03:11.374516 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.374443 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" event={"ID":"bd6ed8b0-01d0-4602-8a0a-f799772e3116","Type":"ContainerDied","Data":"1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a"}
Apr 16 21:03:11.375831 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.375808 2563 generic.go:358] "Generic (PLEG): container finished" podID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerID="cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352" exitCode=0
Apr 16 21:03:11.375918 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.375858 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" event={"ID":"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c","Type":"ContainerDied","Data":"cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352"}
Apr 16 21:03:11.375918 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.375865 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" Apr 16 21:03:11.375918 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.375886 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp" event={"ID":"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c","Type":"ContainerDied","Data":"16a85b4f849ceed01692037e81b9a9d64aa48b6ddf8f7b114c22dcdc41abbe6b"} Apr 16 21:03:11.375918 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.375906 2563 scope.go:117] "RemoveContainer" containerID="cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352" Apr 16 21:03:11.383323 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.383307 2563 scope.go:117] "RemoveContainer" containerID="4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5" Apr 16 21:03:11.388305 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.388283 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c-kserve-provision-location\") pod \"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c\" (UID: \"c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c\") " Apr 16 21:03:11.388621 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.388600 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" (UID: "c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:03:11.389896 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.389878 2563 scope.go:117] "RemoveContainer" containerID="cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352" Apr 16 21:03:11.390123 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:03:11.390104 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352\": container with ID starting with cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352 not found: ID does not exist" containerID="cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352" Apr 16 21:03:11.390184 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.390130 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352"} err="failed to get container status \"cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352\": rpc error: code = NotFound desc = could not find container \"cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352\": container with ID starting with cd8bd6f75aed511ff0080a5b07d3366cfd67d18a18217c24d4fd52af0b1a6352 not found: ID does not exist" Apr 16 21:03:11.390184 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.390146 2563 scope.go:117] "RemoveContainer" containerID="4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5" Apr 16 21:03:11.390323 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:03:11.390308 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5\": container with ID starting with 
4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5 not found: ID does not exist" containerID="4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5" Apr 16 21:03:11.390377 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.390326 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5"} err="failed to get container status \"4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5\": rpc error: code = NotFound desc = could not find container \"4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5\": container with ID starting with 4f9e0c06c653400fcbe5ae08e326be869d99ac20a75fab4f7e297235fbece5a5 not found: ID does not exist" Apr 16 21:03:11.489046 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.489025 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:03:11.697023 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.696995 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"] Apr 16 21:03:11.706942 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:11.706922 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-glfxp"] Apr 16 21:03:12.381626 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:12.381588 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" event={"ID":"bd6ed8b0-01d0-4602-8a0a-f799772e3116","Type":"ContainerStarted","Data":"32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e"} Apr 16 21:03:12.382081 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:12.381899 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" Apr 16 21:03:12.382814 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:12.382794 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 21:03:12.398808 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:12.398765 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podStartSLOduration=7.3987538090000005 podStartE2EDuration="7.398753809s" podCreationTimestamp="2026-04-16 21:03:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:03:12.397632232 +0000 UTC m=+3089.686558017" watchObservedRunningTime="2026-04-16 21:03:12.398753809 +0000 UTC m=+3089.687679592" Apr 16 21:03:13.298950 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:13.298919 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" path="/var/lib/kubelet/pods/c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c/volumes" Apr 16 21:03:13.384814 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:13.384781 2563 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 21:03:23.385645 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:23.385603 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 21:03:33.385489 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:33.385450 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 21:03:43.385384 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:43.385346 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 21:03:53.385696 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:03:53.385652 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 21:04:03.384868 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:03.384782 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 21:04:13.386537 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:13.386497 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" Apr 16 21:04:16.014195 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.014158 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"] Apr 16 21:04:16.014701 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.014495 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" containerID="cri-o://32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e" gracePeriod=30 Apr 16 21:04:16.077588 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.077531 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h"] Apr 16 21:04:16.077947 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.077928 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerName="kserve-container" Apr 16 21:04:16.078033 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.077950 2563 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerName="kserve-container" Apr 16 21:04:16.078033 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.077963 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerName="storage-initializer" Apr 16 21:04:16.078033 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.077971 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerName="storage-initializer" Apr 16 21:04:16.078196 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.078055 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="c6ba00fc-4a19-4e15-9c3b-a3086ec75e5c" containerName="kserve-container" Apr 16 21:04:16.082207 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.082189 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:04:16.087310 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.087289 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h"] Apr 16 21:04:16.180977 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.180950 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8865ea-c625-4be4-b52c-40080c50e2dc-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h\" (UID: \"ca8865ea-c625-4be4-b52c-40080c50e2dc\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:04:16.281813 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.281739 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8865ea-c625-4be4-b52c-40080c50e2dc-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h\" (UID: \"ca8865ea-c625-4be4-b52c-40080c50e2dc\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:04:16.282059 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.282042 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8865ea-c625-4be4-b52c-40080c50e2dc-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h\" (UID: \"ca8865ea-c625-4be4-b52c-40080c50e2dc\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:04:16.392548 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.392524 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:04:16.508986 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.508958 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h"] Apr 16 21:04:16.512003 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:04:16.511975 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca8865ea_c625_4be4_b52c_40080c50e2dc.slice/crio-cb3bd775a7445045bb12c4dd0437fdf16495120b232d1408b85c4c1e34c5c49a WatchSource:0}: Error finding container cb3bd775a7445045bb12c4dd0437fdf16495120b232d1408b85c4c1e34c5c49a: Status 404 returned error can't find the container with id cb3bd775a7445045bb12c4dd0437fdf16495120b232d1408b85c4c1e34c5c49a Apr 16 21:04:16.589723 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.589698 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" event={"ID":"ca8865ea-c625-4be4-b52c-40080c50e2dc","Type":"ContainerStarted","Data":"6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4"} Apr 16 21:04:16.589825 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:16.589734 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" event={"ID":"ca8865ea-c625-4be4-b52c-40080c50e2dc","Type":"ContainerStarted","Data":"cb3bd775a7445045bb12c4dd0437fdf16495120b232d1408b85c4c1e34c5c49a"} Apr 16 21:04:19.352228 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.352205 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" Apr 16 21:04:19.405353 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.405288 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd6ed8b0-01d0-4602-8a0a-f799772e3116-kserve-provision-location\") pod \"bd6ed8b0-01d0-4602-8a0a-f799772e3116\" (UID: \"bd6ed8b0-01d0-4602-8a0a-f799772e3116\") " Apr 16 21:04:19.405614 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.405593 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd6ed8b0-01d0-4602-8a0a-f799772e3116-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bd6ed8b0-01d0-4602-8a0a-f799772e3116" (UID: "bd6ed8b0-01d0-4602-8a0a-f799772e3116"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:04:19.505890 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.505866 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bd6ed8b0-01d0-4602-8a0a-f799772e3116-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:04:19.600107 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.600073 2563 generic.go:358] "Generic (PLEG): container finished" podID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerID="32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e" exitCode=0 Apr 16 21:04:19.600225 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.600151 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" event={"ID":"bd6ed8b0-01d0-4602-8a0a-f799772e3116","Type":"ContainerDied","Data":"32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e"} Apr 16 21:04:19.600225 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.600174 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" Apr 16 21:04:19.600225 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.600188 2563 scope.go:117] "RemoveContainer" containerID="32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e" Apr 16 21:04:19.600338 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.600178 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb" event={"ID":"bd6ed8b0-01d0-4602-8a0a-f799772e3116","Type":"ContainerDied","Data":"3c0f6ecf9ebbcbd4ddf6383ea4d8a75fc3eabf45d31f3509b895bfb7b31381fe"} Apr 16 21:04:19.608186 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.608160 2563 scope.go:117] "RemoveContainer" containerID="1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a" Apr 16 21:04:19.614842 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.614822 2563 scope.go:117] "RemoveContainer" containerID="32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e" Apr 16 21:04:19.615067 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:04:19.615049 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e\": container with ID starting with 32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e not found: ID does not exist" containerID="32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e" Apr 16 21:04:19.615110 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.615074 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e"} err="failed to get container status \"32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e\": rpc error: code = NotFound desc = could not find container \"32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e\": container with ID starting with 32a1c015ea51bcf8d6d368a0360a65d0add88310f0df89479beaf35cb0eb6f9e not found: ID does not exist" Apr 16 21:04:19.615110 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.615091 2563 scope.go:117] "RemoveContainer" containerID="1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a" Apr 16 21:04:19.615291 ip-10-0-138-118 
kubenswrapper[2563]: E0416 21:04:19.615276 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a\": container with ID starting with 1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a not found: ID does not exist" containerID="1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a" Apr 16 21:04:19.615340 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.615294 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a"} err="failed to get container status \"1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a\": rpc error: code = NotFound desc = could not find container \"1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a\": container with ID starting with 1f1585330bd1f43d36fdd6ac256c787d050ed2333a454512bc03bd4eb6a1647a not found: ID does not exist" Apr 16 21:04:19.621334 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.621314 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"] Apr 16 21:04:19.624735 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:19.624712 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-k5pwb"] Apr 16 21:04:20.604732 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:20.604649 2563 generic.go:358] "Generic (PLEG): container finished" podID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerID="6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4" exitCode=0 Apr 16 21:04:20.605126 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:20.604722 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" event={"ID":"ca8865ea-c625-4be4-b52c-40080c50e2dc","Type":"ContainerDied","Data":"6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4"} Apr 16 21:04:21.298662 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:21.298629 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" path="/var/lib/kubelet/pods/bd6ed8b0-01d0-4602-8a0a-f799772e3116/volumes" Apr 16 21:04:21.610587 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:21.610492 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" event={"ID":"ca8865ea-c625-4be4-b52c-40080c50e2dc","Type":"ContainerStarted","Data":"ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318"} Apr 16 21:04:21.611029 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:21.610736 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:04:21.640678 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:21.640636 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" podStartSLOduration=5.6406246509999995 podStartE2EDuration="5.640624651s" podCreationTimestamp="2026-04-16 21:04:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:04:21.639240474 +0000 UTC m=+3158.928166258" 
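The paired E/I entries above ("ContainerStatus from runtime service failed ... NotFound", then "DeleteContainer returned error") are a benign race rather than a real failure: by the time the kubelet re-queries a container it is garbage-collecting, CRI-O has already removed it. The usual pattern this implies is an idempotent delete that treats NotFound as success; a hedged stdlib sketch (errNotFound stands in for the gRPC NotFound code, and the helper names are illustrative, not the kubelet's):

    // Sketch: a second delete of an already-removed container is treated
    // as success, because the desired state ("container gone") is reached.
    package main

    import (
    	"errors"
    	"fmt"
    )

    var errNotFound = errors.New("code = NotFound")

    func removeContainer(id string, remove func(string) error) error {
    	err := remove(id)
    	if errors.Is(err, errNotFound) {
    		return nil // already gone: deletion is idempotent
    	}
    	return err
    }

    func main() {
    	alreadyGone := func(string) error {
    		return fmt.Errorf("could not find container: %w", errNotFound)
    	}
    	// Prints <nil>: NotFound during cleanup is benign.
    	fmt.Println(removeContainer("1f158533...", alreadyGone))
    }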
watchObservedRunningTime="2026-04-16 21:04:21.640624651 +0000 UTC m=+3158.929550438" Apr 16 21:04:52.629223 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:04:52.629169 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 21:05:02.615779 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:02.615750 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:05:06.199847 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.199790 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h"] Apr 16 21:05:06.200280 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.200114 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerName="kserve-container" containerID="cri-o://ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318" gracePeriod=30 Apr 16 21:05:06.259297 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.259262 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"] Apr 16 21:05:06.259641 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.259616 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="storage-initializer" Apr 16 21:05:06.259641 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.259643 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="storage-initializer" Apr 16 21:05:06.259641 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.259651 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" Apr 16 21:05:06.259785 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.259657 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" Apr 16 21:05:06.259785 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.259704 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="bd6ed8b0-01d0-4602-8a0a-f799772e3116" containerName="kserve-container" Apr 16 21:05:06.262758 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.262742 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" Apr 16 21:05:06.272526 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.272500 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"] Apr 16 21:05:06.339967 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.339929 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1cf3941-5f63-448c-9e10-6a1c8bc6b08c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-jjqbj\" (UID: \"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" Apr 16 21:05:06.441012 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.440980 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1cf3941-5f63-448c-9e10-6a1c8bc6b08c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-jjqbj\" (UID: \"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" Apr 16 21:05:06.441291 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.441276 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1cf3941-5f63-448c-9e10-6a1c8bc6b08c-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-jjqbj\" (UID: \"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" Apr 16 21:05:06.574335 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.574264 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" Apr 16 21:05:06.688686 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.688663 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"] Apr 16 21:05:06.690316 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:05:06.690289 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1cf3941_5f63_448c_9e10_6a1c8bc6b08c.slice/crio-2248753ad24d290145ecdf623bc8e4cdb04599ffbbde638c688907f987ee4df3 WatchSource:0}: Error finding container 2248753ad24d290145ecdf623bc8e4cdb04599ffbbde638c688907f987ee4df3: Status 404 returned error can't find the container with id 2248753ad24d290145ecdf623bc8e4cdb04599ffbbde638c688907f987ee4df3 Apr 16 21:05:06.766249 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.766215 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" event={"ID":"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c","Type":"ContainerStarted","Data":"141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda"} Apr 16 21:05:06.766355 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:06.766259 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" event={"ID":"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c","Type":"ContainerStarted","Data":"2248753ad24d290145ecdf623bc8e4cdb04599ffbbde638c688907f987ee4df3"} Apr 16 21:05:10.780760 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:10.780725 2563 generic.go:358] "Generic (PLEG): container finished" podID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerID="141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda" exitCode=0 Apr 16 21:05:10.781129 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:10.780802 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" event={"ID":"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c","Type":"ContainerDied","Data":"141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda"} Apr 16 21:05:11.785733 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:11.785698 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" event={"ID":"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c","Type":"ContainerStarted","Data":"7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612"} Apr 16 21:05:11.786143 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:11.786080 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" Apr 16 21:05:11.787215 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:11.787187 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 16 21:05:11.802195 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:11.802160 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podStartSLOduration=5.802149732 podStartE2EDuration="5.802149732s" podCreationTimestamp="2026-04-16 21:05:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:05:11.800926909 +0000 UTC m=+3209.089852694" watchObservedRunningTime="2026-04-16 21:05:11.802149732 +0000 UTC m=+3209.091075517" Apr 16 21:05:12.614330 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:12.614289 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.65:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 21:05:12.789271 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:12.789235 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 16 21:05:13.236856 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.236828 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:05:13.296706 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.296637 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8865ea-c625-4be4-b52c-40080c50e2dc-kserve-provision-location\") pod \"ca8865ea-c625-4be4-b52c-40080c50e2dc\" (UID: \"ca8865ea-c625-4be4-b52c-40080c50e2dc\") " Apr 16 21:05:13.296929 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.296909 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca8865ea-c625-4be4-b52c-40080c50e2dc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca8865ea-c625-4be4-b52c-40080c50e2dc" (UID: "ca8865ea-c625-4be4-b52c-40080c50e2dc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:05:13.397925 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.397904 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8865ea-c625-4be4-b52c-40080c50e2dc-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:05:13.793645 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.793612 2563 generic.go:358] "Generic (PLEG): container finished" podID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerID="ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318" exitCode=0 Apr 16 21:05:13.794065 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.793673 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" event={"ID":"ca8865ea-c625-4be4-b52c-40080c50e2dc","Type":"ContainerDied","Data":"ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318"} Apr 16 21:05:13.794065 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.793681 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" Apr 16 21:05:13.794065 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.793700 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h" event={"ID":"ca8865ea-c625-4be4-b52c-40080c50e2dc","Type":"ContainerDied","Data":"cb3bd775a7445045bb12c4dd0437fdf16495120b232d1408b85c4c1e34c5c49a"} Apr 16 21:05:13.794065 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.793715 2563 scope.go:117] "RemoveContainer" containerID="ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318" Apr 16 21:05:13.801410 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.801391 2563 scope.go:117] "RemoveContainer" containerID="6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4" Apr 16 21:05:13.808513 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.808485 2563 scope.go:117] "RemoveContainer" containerID="ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318" Apr 16 21:05:13.810053 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:05:13.810031 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318\": container with ID starting with ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318 not found: ID does not exist" containerID="ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318" Apr 16 21:05:13.810156 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.810065 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318"} err="failed to get container status \"ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318\": rpc error: code = NotFound desc = could not find container \"ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318\": container with ID starting with ba45855c2582dd5927511243cdbb0025a537bd35b379cd89a5ba1de80cd41318 not found: ID does not exist" Apr 16 21:05:13.810156 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.810087 2563 scope.go:117] "RemoveContainer" containerID="6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4" Apr 16 21:05:13.810434 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:05:13.810412 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4\": container with ID starting with 6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4 not found: ID does not exist" containerID="6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4" Apr 16 21:05:13.810499 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.810432 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h"] Apr 16 21:05:13.810499 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.810441 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4"} err="failed to get container status \"6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4\": rpc error: code = NotFound desc = could not find container \"6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4\": container with ID 
starting with 6ebcc587ee0278a42cf8fd9a1c152a727807253b76a76b9b23e724bce404aca4 not found: ID does not exist" Apr 16 21:05:13.815833 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:13.815813 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-tbb5h"] Apr 16 21:05:15.298721 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:15.298687 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" path="/var/lib/kubelet/pods/ca8865ea-c625-4be4-b52c-40080c50e2dc/volumes" Apr 16 21:05:22.790027 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:22.789984 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 16 21:05:32.789634 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:32.789520 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 16 21:05:42.789947 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:42.789907 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 16 21:05:52.789821 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:05:52.789782 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 16 21:06:02.790263 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:02.790213 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.66:8080: connect: connection refused" Apr 16 21:06:12.790218 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:12.790182 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" Apr 16 21:06:16.392184 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.392151 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"] Apr 16 21:06:16.392925 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.392411 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" containerID="cri-o://7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612" gracePeriod=30 Apr 16 21:06:16.446475 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.446450 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"] Apr 16 21:06:16.446855 ip-10-0-138-118 kubenswrapper[2563]: I0416 
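The teardown ordering above is consistent across pods: UnmountVolume.TearDown, then "Volume detached", and only about two seconds later "Cleaned up orphaned pod volumes dir" for /var/lib/kubelet/pods/<uid>/volumes. A simplified sketch of that last step, under the assumption that the directory must already be empty before removal (the real check in the kubelet's kubelet_volumes.go is more involved):

    // Sketch: remove a pod's volumes dir only after all volumes are gone.
    // Paths mirror the log entries; the emptiness check is simplified.
    package main

    import (
    	"fmt"
    	"os"
    	"path/filepath"
    )

    func cleanupOrphanedPodDir(kubeletRoot, podUID string) error {
    	dir := filepath.Join(kubeletRoot, "pods", podUID, "volumes")
    	entries, err := os.ReadDir(dir)
    	if err != nil {
    		return err
    	}
    	if len(entries) > 0 {
    		// Something is still mounted or not yet torn down: do not remove.
    		return fmt.Errorf("pod %s: %d volume(s) still present", podUID, len(entries))
    	}
    	return os.Remove(dir)
    }

    func main() {
    	fmt.Println(cleanupOrphanedPodDir("/var/lib/kubelet", "ca8865ea-c625-4be4-b52c-40080c50e2dc"))
    }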
Apr 16 21:06:16.446919 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.446859 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerName="storage-initializer"
Apr 16 21:06:16.446919 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.446877 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerName="kserve-container"
Apr 16 21:06:16.446919 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.446883 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerName="kserve-container"
Apr 16 21:06:16.447078 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.446951 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca8865ea-c625-4be4-b52c-40080c50e2dc" containerName="kserve-container"
Apr 16 21:06:16.451140 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.451124 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:06:16.453621 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.453601 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 16 21:06:16.457354 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.457329 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"]
Apr 16 21:06:16.482683 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.482662 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2248192f-267b-42f2-8c24-832c8b6f21b1-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-988f7fbdb-d4qlb\" (UID: \"2248192f-267b-42f2-8c24-832c8b6f21b1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:06:16.583932 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.583912 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2248192f-267b-42f2-8c24-832c8b6f21b1-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-988f7fbdb-d4qlb\" (UID: \"2248192f-267b-42f2-8c24-832c8b6f21b1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:06:16.584205 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.584189 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2248192f-267b-42f2-8c24-832c8b6f21b1-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-988f7fbdb-d4qlb\" (UID: \"2248192f-267b-42f2-8c24-832c8b6f21b1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:06:16.762298 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.762275 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:06:16.881776 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.881747 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"]
Apr 16 21:06:16.884480 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:06:16.884450 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2248192f_267b_42f2_8c24_832c8b6f21b1.slice/crio-45b9e1ed23c8a1916089eef707e88ebf6f26c099c6d70cc60a25c578044d4903 WatchSource:0}: Error finding container 45b9e1ed23c8a1916089eef707e88ebf6f26c099c6d70cc60a25c578044d4903: Status 404 returned error can't find the container with id 45b9e1ed23c8a1916089eef707e88ebf6f26c099c6d70cc60a25c578044d4903
Apr 16 21:06:16.996149 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.996120 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" event={"ID":"2248192f-267b-42f2-8c24-832c8b6f21b1","Type":"ContainerStarted","Data":"e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685"}
Apr 16 21:06:16.996283 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:16.996157 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" event={"ID":"2248192f-267b-42f2-8c24-832c8b6f21b1","Type":"ContainerStarted","Data":"45b9e1ed23c8a1916089eef707e88ebf6f26c099c6d70cc60a25c578044d4903"}
Apr 16 21:06:18.000389 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:18.000357 2563 generic.go:358] "Generic (PLEG): container finished" podID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerID="e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685" exitCode=0
Apr 16 21:06:18.000729 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:18.000436 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" event={"ID":"2248192f-267b-42f2-8c24-832c8b6f21b1","Type":"ContainerDied","Data":"e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685"}
Apr 16 21:06:19.005642 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:19.005605 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" event={"ID":"2248192f-267b-42f2-8c24-832c8b6f21b1","Type":"ContainerStarted","Data":"73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c"}
Apr 16 21:06:19.006037 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:19.005767 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:06:19.007113 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:19.007085 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused"
Apr 16 21:06:19.025489 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:19.025448 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podStartSLOduration=3.025434752 podStartE2EDuration="3.025434752s" podCreationTimestamp="2026-04-16 21:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:06:19.023910427 +0000 UTC m=+3276.312836213" watchObservedRunningTime="2026-04-16 21:06:19.025434752 +0000 UTC m=+3276.314360537"
Apr 16 21:06:19.827235 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:19.827214 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"
Apr 16 21:06:19.910446 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:19.910383 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1cf3941-5f63-448c-9e10-6a1c8bc6b08c-kserve-provision-location\") pod \"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c\" (UID: \"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c\") "
Apr 16 21:06:19.910692 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:19.910672 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1cf3941-5f63-448c-9e10-6a1c8bc6b08c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" (UID: "b1cf3941-5f63-448c-9e10-6a1c8bc6b08c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 21:06:20.010399 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.010370 2563 generic.go:358] "Generic (PLEG): container finished" podID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerID="7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612" exitCode=0
Apr 16 21:06:20.010816 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.010453 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"
Apr 16 21:06:20.010816 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.010452 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" event={"ID":"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c","Type":"ContainerDied","Data":"7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612"}
Apr 16 21:06:20.010816 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.010607 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj" event={"ID":"b1cf3941-5f63-448c-9e10-6a1c8bc6b08c","Type":"ContainerDied","Data":"2248753ad24d290145ecdf623bc8e4cdb04599ffbbde638c688907f987ee4df3"}
Apr 16 21:06:20.010816 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.010626 2563 scope.go:117] "RemoveContainer" containerID="7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612"
Apr 16 21:06:20.011053 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.010969 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1cf3941-5f63-448c-9e10-6a1c8bc6b08c-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 21:06:20.011053 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.011019 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused"
Apr 16 21:06:20.018833 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.018815 2563 scope.go:117] "RemoveContainer" containerID="141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda"
scope.go:117] "RemoveContainer" containerID="141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda" Apr 16 21:06:20.025766 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.025750 2563 scope.go:117] "RemoveContainer" containerID="7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612" Apr 16 21:06:20.026004 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:06:20.025988 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612\": container with ID starting with 7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612 not found: ID does not exist" containerID="7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612" Apr 16 21:06:20.026051 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.026012 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612"} err="failed to get container status \"7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612\": rpc error: code = NotFound desc = could not find container \"7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612\": container with ID starting with 7839066bc4c64755fecb1b6cf9bd93760ddc8a35a2f703a1f7cfb3c3b8377612 not found: ID does not exist" Apr 16 21:06:20.026051 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.026028 2563 scope.go:117] "RemoveContainer" containerID="141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda" Apr 16 21:06:20.026252 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:06:20.026234 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda\": container with ID starting with 141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda not found: ID does not exist" containerID="141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda" Apr 16 21:06:20.026298 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.026258 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda"} err="failed to get container status \"141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda\": rpc error: code = NotFound desc = could not find container \"141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda\": container with ID starting with 141c70a19f73a9b0eda99721e6cdf415ba88a9e7e4e42580e57806d33f31cdda not found: ID does not exist" Apr 16 21:06:20.031773 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.031753 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"] Apr 16 21:06:20.034690 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:20.034670 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-jjqbj"] Apr 16 21:06:21.298640 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:21.298606 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" path="/var/lib/kubelet/pods/b1cf3941-5f63-448c-9e10-6a1c8bc6b08c/volumes" Apr 16 21:06:30.011721 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:30.011683 2563 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 21:06:40.011515 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:40.011474 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 21:06:50.011717 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:06:50.011677 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 21:07:00.011036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:00.010991 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 21:07:10.011428 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:10.011343 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 21:07:20.012317 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:20.012287 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" Apr 16 21:07:26.551397 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.551367 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"] Apr 16 21:07:26.551767 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.551682 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" containerID="cri-o://73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c" gracePeriod=30 Apr 16 21:07:26.655396 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.655365 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"] Apr 16 21:07:26.655771 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.655754 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" Apr 16 21:07:26.655771 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.655773 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" Apr 16 21:07:26.655869 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.655784 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="storage-initializer" Apr 16 21:07:26.655869 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.655789 2563 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="storage-initializer" Apr 16 21:07:26.655869 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.655840 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1cf3941-5f63-448c-9e10-6a1c8bc6b08c" containerName="kserve-container" Apr 16 21:07:26.658674 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.658656 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:07:26.661471 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.661451 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 21:07:26.667512 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.667489 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"] Apr 16 21:07:26.695733 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.695708 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw\" (UID: \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:07:26.695823 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.695782 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw\" (UID: \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:07:26.796380 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.796354 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw\" (UID: \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:07:26.796499 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.796395 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw\" (UID: \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:07:26.796787 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.796771 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw\" (UID: \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:07:26.796959 ip-10-0-138-118 
Apr 16 21:07:26.970185 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:26.970162 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"
Apr 16 21:07:27.084940 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:27.084914 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"]
Apr 16 21:07:27.087394 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:07:27.087363 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0bbad4e_b9fe_4058_97f8_e630d415ee5e.slice/crio-a600917d5be49e08d06f0a434536a54f91cb3fe0f1b2f7de6a441b2e9897f8af WatchSource:0}: Error finding container a600917d5be49e08d06f0a434536a54f91cb3fe0f1b2f7de6a441b2e9897f8af: Status 404 returned error can't find the container with id a600917d5be49e08d06f0a434536a54f91cb3fe0f1b2f7de6a441b2e9897f8af
Apr 16 21:07:27.089186 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:27.089171 2563 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 21:07:27.219923 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:27.219897 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" event={"ID":"d0bbad4e-b9fe-4058-97f8-e630d415ee5e","Type":"ContainerStarted","Data":"584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1"}
Apr 16 21:07:27.220049 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:27.219933 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" event={"ID":"d0bbad4e-b9fe-4058-97f8-e630d415ee5e","Type":"ContainerStarted","Data":"a600917d5be49e08d06f0a434536a54f91cb3fe0f1b2f7de6a441b2e9897f8af"}
Apr 16 21:07:28.224455 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:28.224420 2563 generic.go:358] "Generic (PLEG): container finished" podID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerID="584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1" exitCode=0
Apr 16 21:07:28.224839 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:28.224503 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" event={"ID":"d0bbad4e-b9fe-4058-97f8-e630d415ee5e","Type":"ContainerDied","Data":"584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1"}
Apr 16 21:07:29.229012 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:29.228977 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" event={"ID":"d0bbad4e-b9fe-4058-97f8-e630d415ee5e","Type":"ContainerStarted","Data":"f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710"}
Apr 16 21:07:29.229467 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:29.229235 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"
Apr 16 21:07:29.230940 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:29.230907 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused"
Apr 16 21:07:29.246768 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:29.246722 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podStartSLOduration=3.246709391 podStartE2EDuration="3.246709391s" podCreationTimestamp="2026-04-16 21:07:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:07:29.245841539 +0000 UTC m=+3346.534767324" watchObservedRunningTime="2026-04-16 21:07:29.246709391 +0000 UTC m=+3346.535635166"
Apr 16 21:07:30.011123 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:30.011085 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused"
Apr 16 21:07:30.233789 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:30.233746 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused"
Apr 16 21:07:30.487833 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:30.487803 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:07:30.524920 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:30.524897 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2248192f-267b-42f2-8c24-832c8b6f21b1-kserve-provision-location\") pod \"2248192f-267b-42f2-8c24-832c8b6f21b1\" (UID: \"2248192f-267b-42f2-8c24-832c8b6f21b1\") "
Apr 16 21:07:30.525154 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:30.525133 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2248192f-267b-42f2-8c24-832c8b6f21b1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2248192f-267b-42f2-8c24-832c8b6f21b1" (UID: "2248192f-267b-42f2-8c24-832c8b6f21b1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 21:07:30.626162 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:30.626109 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2248192f-267b-42f2-8c24-832c8b6f21b1-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 21:07:31.237712 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.237682 2563 generic.go:358] "Generic (PLEG): container finished" podID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerID="73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c" exitCode=0
Apr 16 21:07:31.238194 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.237748 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"
Apr 16 21:07:31.238194 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.237771 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" event={"ID":"2248192f-267b-42f2-8c24-832c8b6f21b1","Type":"ContainerDied","Data":"73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c"}
Apr 16 21:07:31.238194 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.237806 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb" event={"ID":"2248192f-267b-42f2-8c24-832c8b6f21b1","Type":"ContainerDied","Data":"45b9e1ed23c8a1916089eef707e88ebf6f26c099c6d70cc60a25c578044d4903"}
Apr 16 21:07:31.238194 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.237820 2563 scope.go:117] "RemoveContainer" containerID="73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c"
Apr 16 21:07:31.252135 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.252115 2563 scope.go:117] "RemoveContainer" containerID="e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685"
Apr 16 21:07:31.258901 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.258885 2563 scope.go:117] "RemoveContainer" containerID="73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c"
Apr 16 21:07:31.259143 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:07:31.259122 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c\": container with ID starting with 73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c not found: ID does not exist" containerID="73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c"
Apr 16 21:07:31.259210 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.259155 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c"} err="failed to get container status \"73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c\": rpc error: code = NotFound desc = could not find container \"73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c\": container with ID starting with 73a6a3a1a84872a294a642af9d88f5789210f24ad3847e4fb732d30355551a7c not found: ID does not exist"
Apr 16 21:07:31.259210 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.259179 2563 scope.go:117] "RemoveContainer" containerID="e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685"
Apr 16 21:07:31.259416 ip-10-0-138-118 kubenswrapper[2563]:
E0416 21:07:31.259397 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685\": container with ID starting with e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685 not found: ID does not exist" containerID="e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685" Apr 16 21:07:31.259455 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.259423 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685"} err="failed to get container status \"e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685\": rpc error: code = NotFound desc = could not find container \"e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685\": container with ID starting with e570ce346fdfed0de6203754471e6ca7f3fc3e7e45426601c43c506579939685 not found: ID does not exist" Apr 16 21:07:31.262619 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.262598 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"] Apr 16 21:07:31.267973 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.267955 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-988f7fbdb-d4qlb"] Apr 16 21:07:31.298356 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:31.298334 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" path="/var/lib/kubelet/pods/2248192f-267b-42f2-8c24-832c8b6f21b1/volumes" Apr 16 21:07:40.234705 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:40.234666 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 16 21:07:50.234098 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:07:50.234046 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 16 21:08:00.234430 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:00.234386 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 16 21:08:10.233714 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:10.233671 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 16 21:08:20.234610 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:20.234547 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 16 21:08:30.234777 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:30.234739 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:08:36.722664 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:36.722595 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"] Apr 16 21:08:36.723082 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:36.722876 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" containerID="cri-o://f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710" gracePeriod=30 Apr 16 21:08:37.791671 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.791635 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw"] Apr 16 21:08:37.792036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.791941 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" Apr 16 21:08:37.792036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.791951 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" Apr 16 21:08:37.792036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.791967 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="storage-initializer" Apr 16 21:08:37.792036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.791974 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="storage-initializer" Apr 16 21:08:37.792036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.792022 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="2248192f-267b-42f2-8c24-832c8b6f21b1" containerName="kserve-container" Apr 16 21:08:37.795036 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.795017 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" Apr 16 21:08:37.805162 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.805140 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw"] Apr 16 21:08:37.926826 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:37.926798 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c4fa061-72b4-4bff-aebf-2b21e34272c9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw\" (UID: \"2c4fa061-72b4-4bff-aebf-2b21e34272c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" Apr 16 21:08:38.027854 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:38.027824 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c4fa061-72b4-4bff-aebf-2b21e34272c9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw\" (UID: \"2c4fa061-72b4-4bff-aebf-2b21e34272c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" Apr 16 21:08:38.028181 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:38.028159 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c4fa061-72b4-4bff-aebf-2b21e34272c9-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw\" (UID: \"2c4fa061-72b4-4bff-aebf-2b21e34272c9\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" Apr 16 21:08:38.106305 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:38.106237 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" Apr 16 21:08:38.227227 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:38.227202 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw"] Apr 16 21:08:38.229897 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:08:38.229869 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c4fa061_72b4_4bff_aebf_2b21e34272c9.slice/crio-6b4d17e00d68bf4a935a40796c66a5dd2515b30b1c49d3347399e41422dd8cc5 WatchSource:0}: Error finding container 6b4d17e00d68bf4a935a40796c66a5dd2515b30b1c49d3347399e41422dd8cc5: Status 404 returned error can't find the container with id 6b4d17e00d68bf4a935a40796c66a5dd2515b30b1c49d3347399e41422dd8cc5 Apr 16 21:08:38.442310 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:38.442279 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" event={"ID":"2c4fa061-72b4-4bff-aebf-2b21e34272c9","Type":"ContainerStarted","Data":"eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122"} Apr 16 21:08:38.442484 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:38.442318 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" event={"ID":"2c4fa061-72b4-4bff-aebf-2b21e34272c9","Type":"ContainerStarted","Data":"6b4d17e00d68bf4a935a40796c66a5dd2515b30b1c49d3347399e41422dd8cc5"} Apr 16 21:08:40.234307 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.234271 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.68:8080: connect: connection refused" Apr 16 21:08:40.451188 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.451167 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_2c4fa061-72b4-4bff-aebf-2b21e34272c9/storage-initializer/0.log" Apr 16 21:08:40.451311 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.451202 2563 generic.go:358] "Generic (PLEG): container finished" podID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerID="eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122" exitCode=1 Apr 16 21:08:40.451311 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.451276 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" event={"ID":"2c4fa061-72b4-4bff-aebf-2b21e34272c9","Type":"ContainerDied","Data":"eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122"} Apr 16 21:08:40.670699 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.670676 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:08:40.751405 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.751376 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-cabundle-cert\") pod \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\" (UID: \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\") " Apr 16 21:08:40.751589 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.751454 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-kserve-provision-location\") pod \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\" (UID: \"d0bbad4e-b9fe-4058-97f8-e630d415ee5e\") " Apr 16 21:08:40.751769 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.751746 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "d0bbad4e-b9fe-4058-97f8-e630d415ee5e" (UID: "d0bbad4e-b9fe-4058-97f8-e630d415ee5e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:08:40.751816 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.751751 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d0bbad4e-b9fe-4058-97f8-e630d415ee5e" (UID: "d0bbad4e-b9fe-4058-97f8-e630d415ee5e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:08:40.852124 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.852057 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:08:40.852124 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:40.852082 2563 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/d0bbad4e-b9fe-4058-97f8-e630d415ee5e-cabundle-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:08:41.455889 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.455864 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_2c4fa061-72b4-4bff-aebf-2b21e34272c9/storage-initializer/0.log" Apr 16 21:08:41.456237 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.455969 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" event={"ID":"2c4fa061-72b4-4bff-aebf-2b21e34272c9","Type":"ContainerStarted","Data":"cdc8aef1860c5a720754c9db9b94f3a8dd8df902830a039cbafd1c7441be119e"} Apr 16 21:08:41.457355 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.457332 2563 generic.go:358] "Generic (PLEG): container finished" podID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerID="f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710" exitCode=0 Apr 16 21:08:41.457473 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.457402 2563 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" Apr 16 21:08:41.457540 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.457402 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" event={"ID":"d0bbad4e-b9fe-4058-97f8-e630d415ee5e","Type":"ContainerDied","Data":"f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710"} Apr 16 21:08:41.457540 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.457503 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw" event={"ID":"d0bbad4e-b9fe-4058-97f8-e630d415ee5e","Type":"ContainerDied","Data":"a600917d5be49e08d06f0a434536a54f91cb3fe0f1b2f7de6a441b2e9897f8af"} Apr 16 21:08:41.457540 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.457519 2563 scope.go:117] "RemoveContainer" containerID="f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710" Apr 16 21:08:41.466120 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.466088 2563 scope.go:117] "RemoveContainer" containerID="584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1" Apr 16 21:08:41.472923 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.472904 2563 scope.go:117] "RemoveContainer" containerID="f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710" Apr 16 21:08:41.473173 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:08:41.473152 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710\": container with ID starting with f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710 not found: ID does not exist" containerID="f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710" Apr 16 21:08:41.473216 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.473180 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710"} err="failed to get container status \"f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710\": rpc error: code = NotFound desc = could not find container \"f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710\": container with ID starting with f469ebee5d5bdf48384fc7025607026aff6b563469a3b7d9dbd90a18f9b01710 not found: ID does not exist" Apr 16 21:08:41.473216 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.473199 2563 scope.go:117] "RemoveContainer" containerID="584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1" Apr 16 21:08:41.473423 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:08:41.473404 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1\": container with ID starting with 584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1 not found: ID does not exist" containerID="584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1" Apr 16 21:08:41.473479 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.473428 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1"} err="failed to get container status 
\"584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1\": rpc error: code = NotFound desc = could not find container \"584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1\": container with ID starting with 584e7626346c7c4d5b3dd581599b453dfb81889e8ad4d162269b8cff848155f1 not found: ID does not exist" Apr 16 21:08:41.490283 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.490261 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"] Apr 16 21:08:41.496888 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:41.496867 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5f59498c54-fgkhw"] Apr 16 21:08:43.298806 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:43.298766 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" path="/var/lib/kubelet/pods/d0bbad4e-b9fe-4058-97f8-e630d415ee5e/volumes" Apr 16 21:08:46.479599 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:46.479551 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_2c4fa061-72b4-4bff-aebf-2b21e34272c9/storage-initializer/1.log" Apr 16 21:08:46.479972 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:46.479913 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_2c4fa061-72b4-4bff-aebf-2b21e34272c9/storage-initializer/0.log" Apr 16 21:08:46.479972 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:46.479943 2563 generic.go:358] "Generic (PLEG): container finished" podID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerID="cdc8aef1860c5a720754c9db9b94f3a8dd8df902830a039cbafd1c7441be119e" exitCode=1 Apr 16 21:08:46.480053 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:46.480020 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" event={"ID":"2c4fa061-72b4-4bff-aebf-2b21e34272c9","Type":"ContainerDied","Data":"cdc8aef1860c5a720754c9db9b94f3a8dd8df902830a039cbafd1c7441be119e"} Apr 16 21:08:46.480091 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:46.480067 2563 scope.go:117] "RemoveContainer" containerID="eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122" Apr 16 21:08:46.480382 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:46.480357 2563 scope.go:117] "RemoveContainer" containerID="eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122" Apr 16 21:08:46.490424 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:08:46.490398 2563 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_kserve-ci-e2e-test_2c4fa061-72b4-4bff-aebf-2b21e34272c9_0 in pod sandbox 6b4d17e00d68bf4a935a40796c66a5dd2515b30b1c49d3347399e41422dd8cc5 from index: no such id: 'eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122'" containerID="eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122" Apr 16 21:08:46.490498 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:46.490433 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122"} err="rpc error: code = Unknown desc = failed to 
delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_kserve-ci-e2e-test_2c4fa061-72b4-4bff-aebf-2b21e34272c9_0 in pod sandbox 6b4d17e00d68bf4a935a40796c66a5dd2515b30b1c49d3347399e41422dd8cc5 from index: no such id: 'eb6f3e54032afc3e8aa480e876cb3965ce3f88d5bec6a40e8bc3912cf05c3122'" Apr 16 21:08:46.490617 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:08:46.490596 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_kserve-ci-e2e-test(2c4fa061-72b4-4bff-aebf-2b21e34272c9)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" Apr 16 21:08:47.484456 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:47.484431 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_2c4fa061-72b4-4bff-aebf-2b21e34272c9/storage-initializer/1.log" Apr 16 21:08:47.785052 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:47.784969 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw"] Apr 16 21:08:47.911895 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:47.911870 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_2c4fa061-72b4-4bff-aebf-2b21e34272c9/storage-initializer/1.log" Apr 16 21:08:47.912073 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:47.911930 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" Apr 16 21:08:48.003167 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.003142 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c4fa061-72b4-4bff-aebf-2b21e34272c9-kserve-provision-location\") pod \"2c4fa061-72b4-4bff-aebf-2b21e34272c9\" (UID: \"2c4fa061-72b4-4bff-aebf-2b21e34272c9\") " Apr 16 21:08:48.003385 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.003364 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c4fa061-72b4-4bff-aebf-2b21e34272c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2c4fa061-72b4-4bff-aebf-2b21e34272c9" (UID: "2c4fa061-72b4-4bff-aebf-2b21e34272c9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:08:48.104601 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.104542 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2c4fa061-72b4-4bff-aebf-2b21e34272c9-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:08:48.488597 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.488551 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw_2c4fa061-72b4-4bff-aebf-2b21e34272c9/storage-initializer/1.log" Apr 16 21:08:48.489008 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.488674 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" Apr 16 21:08:48.489008 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.488699 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw" event={"ID":"2c4fa061-72b4-4bff-aebf-2b21e34272c9","Type":"ContainerDied","Data":"6b4d17e00d68bf4a935a40796c66a5dd2515b30b1c49d3347399e41422dd8cc5"} Apr 16 21:08:48.489008 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.488744 2563 scope.go:117] "RemoveContainer" containerID="cdc8aef1860c5a720754c9db9b94f3a8dd8df902830a039cbafd1c7441be119e" Apr 16 21:08:48.522046 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.522022 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw"] Apr 16 21:08:48.525522 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.525501 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-b8598767b-srqmw"] Apr 16 21:08:48.857293 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857212 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw"] Apr 16 21:08:48.857633 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857614 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" Apr 16 21:08:48.857737 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857635 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" Apr 16 21:08:48.857737 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857652 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="storage-initializer" Apr 16 21:08:48.857737 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857660 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="storage-initializer" Apr 16 21:08:48.857737 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857673 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerName="storage-initializer" Apr 16 21:08:48.857737 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857682 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerName="storage-initializer" Apr 16 21:08:48.857737 
ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857696 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerName="storage-initializer" Apr 16 21:08:48.857737 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857703 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerName="storage-initializer" Apr 16 21:08:48.858089 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857783 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0bbad4e-b9fe-4058-97f8-e630d415ee5e" containerName="kserve-container" Apr 16 21:08:48.858089 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857797 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerName="storage-initializer" Apr 16 21:08:48.858089 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.857808 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" containerName="storage-initializer" Apr 16 21:08:48.862110 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.862091 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:48.864649 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.864628 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 21:08:48.864744 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.864664 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-757nb\"" Apr 16 21:08:48.864744 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.864733 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 21:08:48.868498 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.868478 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw"] Apr 16 21:08:48.909914 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.909890 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74474974-5930-48f3-a895-f25c608ebbfa-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:48.910015 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:48.909965 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74474974-5930-48f3-a895-f25c608ebbfa-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:49.010682 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.010660 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74474974-5930-48f3-a895-f25c608ebbfa-cabundle-cert\") pod 
\"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:49.010780 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.010693 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74474974-5930-48f3-a895-f25c608ebbfa-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:49.011001 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.010983 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74474974-5930-48f3-a895-f25c608ebbfa-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:49.011252 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.011234 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74474974-5930-48f3-a895-f25c608ebbfa-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:49.173195 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.173173 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:49.293814 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.293780 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw"] Apr 16 21:08:49.297130 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:08:49.297101 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74474974_5930_48f3_a895_f25c608ebbfa.slice/crio-8ce97297369121278c20c5ff1763557a3cb3a6bfff94f4055775df2c8681f5aa WatchSource:0}: Error finding container 8ce97297369121278c20c5ff1763557a3cb3a6bfff94f4055775df2c8681f5aa: Status 404 returned error can't find the container with id 8ce97297369121278c20c5ff1763557a3cb3a6bfff94f4055775df2c8681f5aa Apr 16 21:08:49.301024 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.301000 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c4fa061-72b4-4bff-aebf-2b21e34272c9" path="/var/lib/kubelet/pods/2c4fa061-72b4-4bff-aebf-2b21e34272c9/volumes" Apr 16 21:08:49.496268 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.496186 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" event={"ID":"74474974-5930-48f3-a895-f25c608ebbfa","Type":"ContainerStarted","Data":"1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e"} Apr 16 21:08:49.496268 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:49.496230 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" event={"ID":"74474974-5930-48f3-a895-f25c608ebbfa","Type":"ContainerStarted","Data":"8ce97297369121278c20c5ff1763557a3cb3a6bfff94f4055775df2c8681f5aa"} Apr 16 21:08:50.500483 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:50.500397 2563 generic.go:358] "Generic (PLEG): container finished" podID="74474974-5930-48f3-a895-f25c608ebbfa" containerID="1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e" exitCode=0 Apr 16 21:08:50.500483 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:50.500450 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" event={"ID":"74474974-5930-48f3-a895-f25c608ebbfa","Type":"ContainerDied","Data":"1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e"} Apr 16 21:08:51.505512 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:51.505478 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" event={"ID":"74474974-5930-48f3-a895-f25c608ebbfa","Type":"ContainerStarted","Data":"53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466"} Apr 16 21:08:51.505936 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:51.505687 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:08:51.506865 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:51.506838 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" 
Apr 16 21:08:51.525973 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:51.525935 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podStartSLOduration=3.525923035 podStartE2EDuration="3.525923035s" podCreationTimestamp="2026-04-16 21:08:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:08:51.523852948 +0000 UTC m=+3428.812778737" watchObservedRunningTime="2026-04-16 21:08:51.525923035 +0000 UTC m=+3428.814848819" Apr 16 21:08:52.509286 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:08:52.509247 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" Apr 16 21:09:02.509752 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:02.509712 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" Apr 16 21:09:12.509539 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:12.509493 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" Apr 16 21:09:22.509979 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:22.509935 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" Apr 16 21:09:32.509417 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:32.509382 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" Apr 16 21:09:42.509234 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:42.509191 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" Apr 16 21:09:52.510286 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:52.510252 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:09:58.887306 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:58.887268 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw"] Apr 16 21:09:58.887703 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:58.887535 2563 kuberuntime_container.go:864] "Killing container 
with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" containerID="cri-o://53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466" gracePeriod=30 Apr 16 21:09:59.958476 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:59.958439 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"] Apr 16 21:09:59.961810 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:59.961793 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" Apr 16 21:09:59.969193 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:09:59.969170 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"] Apr 16 21:10:00.057155 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:00.057119 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v\" (UID: \"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" Apr 16 21:10:00.162578 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:00.157919 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v\" (UID: \"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" Apr 16 21:10:00.162578 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:00.158446 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v\" (UID: \"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" Apr 16 21:10:00.273980 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:00.273904 2563 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" Apr 16 21:10:00.394142 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:00.394120 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"] Apr 16 21:10:00.396233 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:10:00.396206 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2ee75eb_e5ec_4ef3_b7ef_221cc68c7867.slice/crio-5fa705e4c0f34a805a7a636c897ec84f02ac1dd1b0b576cc365d3b8f4f0e44eb WatchSource:0}: Error finding container 5fa705e4c0f34a805a7a636c897ec84f02ac1dd1b0b576cc365d3b8f4f0e44eb: Status 404 returned error can't find the container with id 5fa705e4c0f34a805a7a636c897ec84f02ac1dd1b0b576cc365d3b8f4f0e44eb Apr 16 21:10:00.713549 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:00.713513 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" event={"ID":"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867","Type":"ContainerStarted","Data":"2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476"} Apr 16 21:10:00.713751 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:00.713570 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" event={"ID":"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867","Type":"ContainerStarted","Data":"5fa705e4c0f34a805a7a636c897ec84f02ac1dd1b0b576cc365d3b8f4f0e44eb"} Apr 16 21:10:02.510323 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:02.510230 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.70:8080: connect: connection refused" Apr 16 21:10:03.091696 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.091640 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:10:03.180106 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.180067 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74474974-5930-48f3-a895-f25c608ebbfa-kserve-provision-location\") pod \"74474974-5930-48f3-a895-f25c608ebbfa\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " Apr 16 21:10:03.180268 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.180125 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74474974-5930-48f3-a895-f25c608ebbfa-cabundle-cert\") pod \"74474974-5930-48f3-a895-f25c608ebbfa\" (UID: \"74474974-5930-48f3-a895-f25c608ebbfa\") " Apr 16 21:10:03.180417 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.180343 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/74474974-5930-48f3-a895-f25c608ebbfa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "74474974-5930-48f3-a895-f25c608ebbfa" (UID: "74474974-5930-48f3-a895-f25c608ebbfa"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 21:10:03.180488 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.180471 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74474974-5930-48f3-a895-f25c608ebbfa-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "74474974-5930-48f3-a895-f25c608ebbfa" (UID: "74474974-5930-48f3-a895-f25c608ebbfa"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 21:10:03.281692 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.281662 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/74474974-5930-48f3-a895-f25c608ebbfa-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:10:03.281692 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.281689 2563 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/74474974-5930-48f3-a895-f25c608ebbfa-cabundle-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\"" Apr 16 21:10:03.726794 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.726756 2563 generic.go:358] "Generic (PLEG): container finished" podID="74474974-5930-48f3-a895-f25c608ebbfa" containerID="53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466" exitCode=0 Apr 16 21:10:03.727260 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.726831 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" Apr 16 21:10:03.727260 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.726832 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" event={"ID":"74474974-5930-48f3-a895-f25c608ebbfa","Type":"ContainerDied","Data":"53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466"} Apr 16 21:10:03.727260 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.726877 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw" event={"ID":"74474974-5930-48f3-a895-f25c608ebbfa","Type":"ContainerDied","Data":"8ce97297369121278c20c5ff1763557a3cb3a6bfff94f4055775df2c8681f5aa"} Apr 16 21:10:03.727260 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.726898 2563 scope.go:117] "RemoveContainer" containerID="53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466" Apr 16 21:10:03.728253 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.728226 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/storage-initializer/0.log" Apr 16 21:10:03.728378 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.728262 2563 generic.go:358] "Generic (PLEG): container finished" podID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerID="2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476" exitCode=1 Apr 16 21:10:03.728378 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.728317 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" event={"ID":"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867","Type":"ContainerDied","Data":"2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476"} 
Apr 16 21:10:03.735106 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.735061 2563 scope.go:117] "RemoveContainer" containerID="1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e"
Apr 16 21:10:03.742288 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.742220 2563 scope.go:117] "RemoveContainer" containerID="53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466"
Apr 16 21:10:03.742528 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:10:03.742504 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466\": container with ID starting with 53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466 not found: ID does not exist" containerID="53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466"
Apr 16 21:10:03.742619 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.742538 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466"} err="failed to get container status \"53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466\": rpc error: code = NotFound desc = could not find container \"53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466\": container with ID starting with 53c53064cfdd73c5122598054442b5b88f88a0f1029fc5ac81595e0ffeba8466 not found: ID does not exist"
Apr 16 21:10:03.742619 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.742577 2563 scope.go:117] "RemoveContainer" containerID="1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e"
Apr 16 21:10:03.742818 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:10:03.742793 2563 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e\": container with ID starting with 1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e not found: ID does not exist" containerID="1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e"
Apr 16 21:10:03.742928 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.742826 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e"} err="failed to get container status \"1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e\": rpc error: code = NotFound desc = could not find container \"1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e\": container with ID starting with 1529cd9e88bf7b52855e1e7ef89cea86723809efa11c56af2f021fd660a04c5e not found: ID does not exist"
Apr 16 21:10:03.743613 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.743594 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw"]
Apr 16 21:10:03.748941 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:03.748918 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-685cf5bd74-jbggw"]
Apr 16 21:10:04.733086 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:04.733057 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/storage-initializer/0.log"
Apr 16 21:10:04.733537 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:04.733158 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" event={"ID":"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867","Type":"ContainerStarted","Data":"690774a698fc0eb05b51533dd0e4674f086f89ee3c2b88715c27ee6bdd17ba5d"}
Apr 16 21:10:05.298671 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:05.298640 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74474974-5930-48f3-a895-f25c608ebbfa" path="/var/lib/kubelet/pods/74474974-5930-48f3-a895-f25c608ebbfa/volumes"
Apr 16 21:10:07.742799 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:07.742772 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/storage-initializer/1.log"
Apr 16 21:10:07.743175 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:07.743132 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/storage-initializer/0.log"
Apr 16 21:10:07.743175 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:07.743164 2563 generic.go:358] "Generic (PLEG): container finished" podID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerID="690774a698fc0eb05b51533dd0e4674f086f89ee3c2b88715c27ee6bdd17ba5d" exitCode=1
Apr 16 21:10:07.743262 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:07.743240 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" event={"ID":"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867","Type":"ContainerDied","Data":"690774a698fc0eb05b51533dd0e4674f086f89ee3c2b88715c27ee6bdd17ba5d"}
Apr 16 21:10:07.743299 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:07.743283 2563 scope.go:117] "RemoveContainer" containerID="2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476"
Apr 16 21:10:07.743619 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:07.743604 2563 scope.go:117] "RemoveContainer" containerID="2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476"
Apr 16 21:10:07.753591 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:10:07.753546 2563 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_kserve-ci-e2e-test_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867_0 in pod sandbox 5fa705e4c0f34a805a7a636c897ec84f02ac1dd1b0b576cc365d3b8f4f0e44eb from index: no such id: '2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476'" containerID="2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476"
Apr 16 21:10:07.753660 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:10:07.753605 2563 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_kserve-ci-e2e-test_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867_0 in pod sandbox 5fa705e4c0f34a805a7a636c897ec84f02ac1dd1b0b576cc365d3b8f4f0e44eb from index: no such id: '2ff17b9a45f1099f00a1c3affe5461df952ea1625bd313355faae221e0d2c476'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_kserve-ci-e2e-test(a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867)\"" logger="UnhandledError"
Apr 16 21:10:07.754906 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:10:07.754885 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_kserve-ci-e2e-test(a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867"
Apr 16 21:10:08.747398 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:08.747367 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/storage-initializer/1.log"
Apr 16 21:10:09.962623 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:09.962591 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"]
Apr 16 21:10:10.085294 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.085271 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/storage-initializer/1.log"
Apr 16 21:10:10.085398 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.085329 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"
Apr 16 21:10:10.131829 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.131802 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867-kserve-provision-location\") pod \"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867\" (UID: \"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867\") "
Apr 16 21:10:10.132044 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.132025 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" (UID: "a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 21:10:10.233017 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.232958 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 21:10:10.754460 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.754435 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v_a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/storage-initializer/1.log"
Apr 16 21:10:10.754664 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.754528 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v" event={"ID":"a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867","Type":"ContainerDied","Data":"5fa705e4c0f34a805a7a636c897ec84f02ac1dd1b0b576cc365d3b8f4f0e44eb"}
Apr 16 21:10:10.754664 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.754574 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"
Apr 16 21:10:10.754794 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.754582 2563 scope.go:117] "RemoveContainer" containerID="690774a698fc0eb05b51533dd0e4674f086f89ee3c2b88715c27ee6bdd17ba5d"
Apr 16 21:10:10.791245 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.791221 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"]
Apr 16 21:10:10.794952 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:10.794931 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-68d779cb64-phl8v"]
Apr 16 21:10:11.059846 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.059769 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"]
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060059 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060071 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060082 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerName="storage-initializer"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060087 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerName="storage-initializer"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060105 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="storage-initializer"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060111 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="storage-initializer"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060165 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="74474974-5930-48f3-a895-f25c608ebbfa" containerName="kserve-container"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060172 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerName="storage-initializer"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060180 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerName="storage-initializer"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060225 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerName="storage-initializer"
Apr 16 21:10:11.060254 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.060231 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" containerName="storage-initializer"
Apr 16 21:10:11.064596 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.064552 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.067269 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.067251 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 16 21:10:11.067972 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.067953 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 21:10:11.068079 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.067991 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-757nb\""
Apr 16 21:10:11.079831 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.079804 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"]
Apr 16 21:10:11.138693 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.138663 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/032cbcb2-b329-4a72-9878-68273b138a1d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.138785 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.138724 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/032cbcb2-b329-4a72-9878-68273b138a1d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.239960 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.239939 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/032cbcb2-b329-4a72-9878-68273b138a1d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.240079 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.239996 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/032cbcb2-b329-4a72-9878-68273b138a1d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.240281 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.240263 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/032cbcb2-b329-4a72-9878-68273b138a1d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.240551 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.240530 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/032cbcb2-b329-4a72-9878-68273b138a1d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.298671 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.298644 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867" path="/var/lib/kubelet/pods/a2ee75eb-e5ec-4ef3-b7ef-221cc68c7867/volumes"
Apr 16 21:10:11.381343 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.381286 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:11.503933 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.503903 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"]
Apr 16 21:10:11.506970 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:10:11.506941 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod032cbcb2_b329_4a72_9878_68273b138a1d.slice/crio-8f7447a6ac2ea7bf62805a73db1b94dc885258384075b15a2fe7e7aac54e739f WatchSource:0}: Error finding container 8f7447a6ac2ea7bf62805a73db1b94dc885258384075b15a2fe7e7aac54e739f: Status 404 returned error can't find the container with id 8f7447a6ac2ea7bf62805a73db1b94dc885258384075b15a2fe7e7aac54e739f
Apr 16 21:10:11.759282 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.759241 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" event={"ID":"032cbcb2-b329-4a72-9878-68273b138a1d","Type":"ContainerStarted","Data":"7e4bd19edb5ded6b1155856b3fe4f72524b66a2a65182520e08f924d7e96d97d"}
Apr 16 21:10:11.759282 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:11.759281 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" event={"ID":"032cbcb2-b329-4a72-9878-68273b138a1d","Type":"ContainerStarted","Data":"8f7447a6ac2ea7bf62805a73db1b94dc885258384075b15a2fe7e7aac54e739f"}
Apr 16 21:10:12.763363 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:12.763331 2563 generic.go:358] "Generic (PLEG): container finished" podID="032cbcb2-b329-4a72-9878-68273b138a1d" containerID="7e4bd19edb5ded6b1155856b3fe4f72524b66a2a65182520e08f924d7e96d97d" exitCode=0
Apr 16 21:10:12.763751 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:12.763422 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" event={"ID":"032cbcb2-b329-4a72-9878-68273b138a1d","Type":"ContainerDied","Data":"7e4bd19edb5ded6b1155856b3fe4f72524b66a2a65182520e08f924d7e96d97d"}
Apr 16 21:10:13.767942 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:13.767906 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" event={"ID":"032cbcb2-b329-4a72-9878-68273b138a1d","Type":"ContainerStarted","Data":"fcaa6599a9599ad3e1f88bd172e76a696a8522b889e5162e184fe85f276b3974"}
Apr 16 21:10:13.768477 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:13.768129 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:10:13.769484 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:13.769457 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:10:13.785993 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:13.785950 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podStartSLOduration=2.785938777 podStartE2EDuration="2.785938777s" podCreationTimestamp="2026-04-16 21:10:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:10:13.784506031 +0000 UTC m=+3511.073431829" watchObservedRunningTime="2026-04-16 21:10:13.785938777 +0000 UTC m=+3511.074864562"
Apr 16 21:10:14.771181 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:14.771146 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:10:24.772100 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:24.772057 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:10:34.771999 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:34.771952 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:10:44.771907 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:44.771861 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:10:54.771687 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:10:54.771643 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:11:04.772134 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:04.772087 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:11:14.772734 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:14.772697 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:11:21.073166 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:21.073135 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"]
Apr 16 21:11:21.073962 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:21.073399 2563 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" containerID="cri-o://fcaa6599a9599ad3e1f88bd172e76a696a8522b889e5162e184fe85f276b3974" gracePeriod=30
Apr 16 21:11:22.125930 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.125895 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"]
Apr 16 21:11:22.129238 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.129214 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"
Apr 16 21:11:22.137467 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.137440 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"]
Apr 16 21:11:22.291139 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.291103 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/770bf67c-b1a0-4953-b82f-f765c43a8226-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw\" (UID: \"770bf67c-b1a0-4953-b82f-f765c43a8226\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"
Apr 16 21:11:22.392263 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.392178 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/770bf67c-b1a0-4953-b82f-f765c43a8226-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw\" (UID: \"770bf67c-b1a0-4953-b82f-f765c43a8226\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"
Apr 16 21:11:22.392605 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.392584 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/770bf67c-b1a0-4953-b82f-f765c43a8226-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw\" (UID: \"770bf67c-b1a0-4953-b82f-f765c43a8226\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"
Apr 16 21:11:22.442205 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.442175 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"
Apr 16 21:11:22.560621 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.560598 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"]
Apr 16 21:11:22.562643 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:11:22.562616 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod770bf67c_b1a0_4953_b82f_f765c43a8226.slice/crio-31abcf6eed5b02e24c1206252fc4169b2e106f4a8c1b9c1e4adcdf9257f262ae WatchSource:0}: Error finding container 31abcf6eed5b02e24c1206252fc4169b2e106f4a8c1b9c1e4adcdf9257f262ae: Status 404 returned error can't find the container with id 31abcf6eed5b02e24c1206252fc4169b2e106f4a8c1b9c1e4adcdf9257f262ae
Apr 16 21:11:22.971831 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.971791 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw" event={"ID":"770bf67c-b1a0-4953-b82f-f765c43a8226","Type":"ContainerStarted","Data":"5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5"}
Apr 16 21:11:22.972008 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:22.971835 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw" event={"ID":"770bf67c-b1a0-4953-b82f-f765c43a8226","Type":"ContainerStarted","Data":"31abcf6eed5b02e24c1206252fc4169b2e106f4a8c1b9c1e4adcdf9257f262ae"}
Apr 16 21:11:24.771810 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:24.771766 2563 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.72:8080: connect: connection refused"
Apr 16 21:11:24.981847 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:24.981816 2563 generic.go:358] "Generic (PLEG): container finished" podID="032cbcb2-b329-4a72-9878-68273b138a1d" containerID="fcaa6599a9599ad3e1f88bd172e76a696a8522b889e5162e184fe85f276b3974" exitCode=0
Apr 16 21:11:24.981992 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:24.981872 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" event={"ID":"032cbcb2-b329-4a72-9878-68273b138a1d","Type":"ContainerDied","Data":"fcaa6599a9599ad3e1f88bd172e76a696a8522b889e5162e184fe85f276b3974"}
Apr 16 21:11:25.011633 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.011611 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:11:25.114710 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.114646 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/032cbcb2-b329-4a72-9878-68273b138a1d-cabundle-cert\") pod \"032cbcb2-b329-4a72-9878-68273b138a1d\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") "
Apr 16 21:11:25.114852 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.114715 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/032cbcb2-b329-4a72-9878-68273b138a1d-kserve-provision-location\") pod \"032cbcb2-b329-4a72-9878-68273b138a1d\" (UID: \"032cbcb2-b329-4a72-9878-68273b138a1d\") "
Apr 16 21:11:25.114991 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.114969 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/032cbcb2-b329-4a72-9878-68273b138a1d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "032cbcb2-b329-4a72-9878-68273b138a1d" (UID: "032cbcb2-b329-4a72-9878-68273b138a1d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 21:11:25.115065 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.115016 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/032cbcb2-b329-4a72-9878-68273b138a1d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "032cbcb2-b329-4a72-9878-68273b138a1d" (UID: "032cbcb2-b329-4a72-9878-68273b138a1d"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 21:11:25.215487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.215464 2563 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/032cbcb2-b329-4a72-9878-68273b138a1d-cabundle-cert\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 21:11:25.215487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.215486 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/032cbcb2-b329-4a72-9878-68273b138a1d-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 21:11:25.986661 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.986634 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"
Apr 16 21:11:25.987040 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.986631 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw" event={"ID":"032cbcb2-b329-4a72-9878-68273b138a1d","Type":"ContainerDied","Data":"8f7447a6ac2ea7bf62805a73db1b94dc885258384075b15a2fe7e7aac54e739f"}
Apr 16 21:11:25.987040 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.986752 2563 scope.go:117] "RemoveContainer" containerID="fcaa6599a9599ad3e1f88bd172e76a696a8522b889e5162e184fe85f276b3974"
Apr 16 21:11:25.988111 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.988095 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_770bf67c-b1a0-4953-b82f-f765c43a8226/storage-initializer/0.log"
Apr 16 21:11:25.988234 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.988126 2563 generic.go:358] "Generic (PLEG): container finished" podID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerID="5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5" exitCode=1
Apr 16 21:11:25.988234 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.988153 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw" event={"ID":"770bf67c-b1a0-4953-b82f-f765c43a8226","Type":"ContainerDied","Data":"5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5"}
Apr 16 21:11:25.994175 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:25.994161 2563 scope.go:117] "RemoveContainer" containerID="7e4bd19edb5ded6b1155856b3fe4f72524b66a2a65182520e08f924d7e96d97d"
Apr 16 21:11:26.005542 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:26.005523 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"]
Apr 16 21:11:26.009636 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:26.009614 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-7c75b57d87-pb8bw"]
Apr 16 21:11:26.993992 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:26.993964 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_770bf67c-b1a0-4953-b82f-f765c43a8226/storage-initializer/0.log"
Apr 16 21:11:26.994364 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:26.994014 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw" event={"ID":"770bf67c-b1a0-4953-b82f-f765c43a8226","Type":"ContainerStarted","Data":"21c24ef6deef5826b901d78b1d3df567d64e772e8e5d5cd49fe6b311eb6b7dd2"}
Apr 16 21:11:27.299651 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:27.299552 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" path="/var/lib/kubelet/pods/032cbcb2-b329-4a72-9878-68273b138a1d/volumes"
Apr 16 21:11:32.012061 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.012031 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_770bf67c-b1a0-4953-b82f-f765c43a8226/storage-initializer/1.log"
Apr 16 21:11:32.012481 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.012378 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_770bf67c-b1a0-4953-b82f-f765c43a8226/storage-initializer/0.log"
Apr 16 21:11:32.012481 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.012408 2563 generic.go:358] "Generic (PLEG): container finished" podID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerID="21c24ef6deef5826b901d78b1d3df567d64e772e8e5d5cd49fe6b311eb6b7dd2" exitCode=1
Apr 16 21:11:32.012601 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.012487 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw" event={"ID":"770bf67c-b1a0-4953-b82f-f765c43a8226","Type":"ContainerDied","Data":"21c24ef6deef5826b901d78b1d3df567d64e772e8e5d5cd49fe6b311eb6b7dd2"}
Apr 16 21:11:32.012601 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.012534 2563 scope.go:117] "RemoveContainer" containerID="5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5"
Apr 16 21:11:32.012870 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.012850 2563 scope.go:117] "RemoveContainer" containerID="5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5"
Apr 16 21:11:32.023145 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:11:32.023120 2563 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_kserve-ci-e2e-test_770bf67c-b1a0-4953-b82f-f765c43a8226_0 in pod sandbox 31abcf6eed5b02e24c1206252fc4169b2e106f4a8c1b9c1e4adcdf9257f262ae from index: no such id: '5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5'" containerID="5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5"
Apr 16 21:11:32.023207 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.023152 2563 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_kserve-ci-e2e-test_770bf67c-b1a0-4953-b82f-f765c43a8226_0 in pod sandbox 31abcf6eed5b02e24c1206252fc4169b2e106f4a8c1b9c1e4adcdf9257f262ae from index: no such id: '5210bd929dae45b5b13b867835bc1df819d0e484685498cf3f61fdeec4bc13c5'"
Apr 16 21:11:32.023292 ip-10-0-138-118 kubenswrapper[2563]: E0416 21:11:32.023271 2563 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_kserve-ci-e2e-test(770bf67c-b1a0-4953-b82f-f765c43a8226)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226"
Apr 16 21:11:32.132866 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:32.132803 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"]
Apr 16 21:11:33.016752 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:33.016724 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_770bf67c-b1a0-4953-b82f-f765c43a8226/storage-initializer/1.log"
Apr 16 21:11:33.138780 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:33.138761 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_770bf67c-b1a0-4953-b82f-f765c43a8226/storage-initializer/1.log"
Apr 16 21:11:33.138867 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:33.138818 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"
Apr 16 21:11:33.170942 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:33.170921 2563 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/770bf67c-b1a0-4953-b82f-f765c43a8226-kserve-provision-location\") pod \"770bf67c-b1a0-4953-b82f-f765c43a8226\" (UID: \"770bf67c-b1a0-4953-b82f-f765c43a8226\") "
Apr 16 21:11:33.171186 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:33.171166 2563 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770bf67c-b1a0-4953-b82f-f765c43a8226-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "770bf67c-b1a0-4953-b82f-f765c43a8226" (UID: "770bf67c-b1a0-4953-b82f-f765c43a8226"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 21:11:33.271496 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:33.271432 2563 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/770bf67c-b1a0-4953-b82f-f765c43a8226-kserve-provision-location\") on node \"ip-10-0-138-118.ec2.internal\" DevicePath \"\""
Apr 16 21:11:34.021239 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:34.021210 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw_770bf67c-b1a0-4953-b82f-f765c43a8226/storage-initializer/1.log"
Apr 16 21:11:34.021710 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:34.021321 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw" event={"ID":"770bf67c-b1a0-4953-b82f-f765c43a8226","Type":"ContainerDied","Data":"31abcf6eed5b02e24c1206252fc4169b2e106f4a8c1b9c1e4adcdf9257f262ae"}
Apr 16 21:11:34.021710 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:34.021339 2563 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"
Apr 16 21:11:34.021710 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:34.021355 2563 scope.go:117] "RemoveContainer" containerID="21c24ef6deef5826b901d78b1d3df567d64e772e8e5d5cd49fe6b311eb6b7dd2"
Apr 16 21:11:34.050427 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:34.050402 2563 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"]
Apr 16 21:11:34.053893 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:34.053870 2563 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-6784d5b89c-f79xw"]
Apr 16 21:11:35.298272 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:11:35.298241 2563 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226" path="/var/lib/kubelet/pods/770bf67c-b1a0-4953-b82f-f765c43a8226/volumes"
Apr 16 21:12:00.202049 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202014 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g6jdf/must-gather-6nbb9"]
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202333 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerName="storage-initializer"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202344 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerName="storage-initializer"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202357 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202363 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202370 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerName="storage-initializer"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202376 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerName="storage-initializer"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202386 2563 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="storage-initializer"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202392 2563 state_mem.go:107] "Deleted CPUSet assignment" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="storage-initializer"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202453 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="032cbcb2-b329-4a72-9878-68273b138a1d" containerName="kserve-container"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202463 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerName="storage-initializer"
Apr 16 21:12:00.202487 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.202470 2563 memory_manager.go:356] "RemoveStaleState removing state" podUID="770bf67c-b1a0-4953-b82f-f765c43a8226" containerName="storage-initializer"
Apr 16 21:12:00.205176 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.205160 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.207634 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.207605 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6jdf\"/\"kube-root-ca.crt\""
Apr 16 21:12:00.207781 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.207708 2563 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g6jdf\"/\"openshift-service-ca.crt\""
Apr 16 21:12:00.208872 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.208850 2563 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g6jdf\"/\"default-dockercfg-w7zrq\""
Apr 16 21:12:00.211122 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.211101 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/must-gather-6nbb9"]
Apr 16 21:12:00.263661 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.263638 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d806786c-e34c-4252-a46e-c6120c6f18ce-must-gather-output\") pod \"must-gather-6nbb9\" (UID: \"d806786c-e34c-4252-a46e-c6120c6f18ce\") " pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.263753 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.263683 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44b2l\" (UniqueName: \"kubernetes.io/projected/d806786c-e34c-4252-a46e-c6120c6f18ce-kube-api-access-44b2l\") pod \"must-gather-6nbb9\" (UID: \"d806786c-e34c-4252-a46e-c6120c6f18ce\") " pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.364666 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.364638 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d806786c-e34c-4252-a46e-c6120c6f18ce-must-gather-output\") pod \"must-gather-6nbb9\" (UID: \"d806786c-e34c-4252-a46e-c6120c6f18ce\") " pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.364763 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.364691 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-44b2l\" (UniqueName: \"kubernetes.io/projected/d806786c-e34c-4252-a46e-c6120c6f18ce-kube-api-access-44b2l\") pod \"must-gather-6nbb9\" (UID: \"d806786c-e34c-4252-a46e-c6120c6f18ce\") " pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.364949 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.364931 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d806786c-e34c-4252-a46e-c6120c6f18ce-must-gather-output\") pod \"must-gather-6nbb9\" (UID: \"d806786c-e34c-4252-a46e-c6120c6f18ce\") " pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.378352 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.378331 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-44b2l\" (UniqueName: \"kubernetes.io/projected/d806786c-e34c-4252-a46e-c6120c6f18ce-kube-api-access-44b2l\") pod \"must-gather-6nbb9\" (UID: \"d806786c-e34c-4252-a46e-c6120c6f18ce\") " pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.514637 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.514540 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6jdf/must-gather-6nbb9"
Apr 16 21:12:00.634907 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:00.634879 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/must-gather-6nbb9"]
Apr 16 21:12:00.637570 ip-10-0-138-118 kubenswrapper[2563]: W0416 21:12:00.637520 2563 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd806786c_e34c_4252_a46e_c6120c6f18ce.slice/crio-5ec04ee08c962a1ba2d29846c771eff5ad4cdf97198fc6b95d6d8a108ce0290c WatchSource:0}: Error finding container 5ec04ee08c962a1ba2d29846c771eff5ad4cdf97198fc6b95d6d8a108ce0290c: Status 404 returned error can't find the container with id 5ec04ee08c962a1ba2d29846c771eff5ad4cdf97198fc6b95d6d8a108ce0290c
Apr 16 21:12:01.110636 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:01.110597 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/must-gather-6nbb9" event={"ID":"d806786c-e34c-4252-a46e-c6120c6f18ce","Type":"ContainerStarted","Data":"5ec04ee08c962a1ba2d29846c771eff5ad4cdf97198fc6b95d6d8a108ce0290c"}
Apr 16 21:12:02.115580 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:02.115531 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/must-gather-6nbb9" event={"ID":"d806786c-e34c-4252-a46e-c6120c6f18ce","Type":"ContainerStarted","Data":"b8edeade3fd1315c883da843e3e84875d9ccbb959641b02e476a9e9217a4e13f"}
Apr 16 21:12:02.115580 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:02.115581 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/must-gather-6nbb9" event={"ID":"d806786c-e34c-4252-a46e-c6120c6f18ce","Type":"ContainerStarted","Data":"b09a5cf34755f41b9bd942fcc043c8cea6f14fb63bd51405ea7c5772f18198d0"}
Apr 16 21:12:02.132369 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:02.132310 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g6jdf/must-gather-6nbb9" podStartSLOduration=1.254684196 podStartE2EDuration="2.132295758s" podCreationTimestamp="2026-04-16 21:12:00 +0000 UTC" firstStartedPulling="2026-04-16 21:12:00.639355226 +0000 UTC m=+3617.928280990" lastFinishedPulling="2026-04-16 21:12:01.516966777 +0000 UTC m=+3618.805892552" observedRunningTime="2026-04-16 21:12:02.129819236 +0000 UTC m=+3619.418745023" watchObservedRunningTime="2026-04-16 21:12:02.132295758 +0000 UTC m=+3619.421221542"
Apr 16 21:12:03.102449 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:03.102411 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-87vlr_f5c86052-4f3b-4b92-9618-f53193a55301/global-pull-secret-syncer/0.log"
Apr 16 21:12:03.294477 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:03.294442 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tgsc9_2d5dcfae-693b-4c79-8475-17131d139947/konnectivity-agent/0.log"
Apr 16 21:12:03.345707 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:03.345675 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-118.ec2.internal_00a800bd34a7cee6861a5791d3f97be3/haproxy/0.log"
Apr 16 21:12:06.792072 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:06.792039 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xqpsk_03466f36-98b0-4673-b128-7e1d176ae32d/kube-state-metrics/0.log"
Apr 16 21:12:06.814906 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:06.814830 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xqpsk_03466f36-98b0-4673-b128-7e1d176ae32d/kube-rbac-proxy-main/0.log"
Apr 16 21:12:06.839662 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:06.839636 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-xqpsk_03466f36-98b0-4673-b128-7e1d176ae32d/kube-rbac-proxy-self/0.log"
Apr 16 21:12:06.868542 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:06.868509 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-74f849f78c-cnf9b_aee138ef-22a8-4713-af50-26f151c86fe4/metrics-server/0.log"
Apr 16 21:12:06.896747 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:06.896652 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-jxh4c_39ac0b69-dd53-4c7e-92bc-df176b8bd42e/monitoring-plugin/0.log"
Apr 16 21:12:07.006049 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.006013 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ggss5_a7888091-f34c-4d28-a756-9d03c77ffcb3/node-exporter/0.log"
Apr 16 21:12:07.028243 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.028207 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ggss5_a7888091-f34c-4d28-a756-9d03c77ffcb3/kube-rbac-proxy/0.log"
Apr 16 21:12:07.048965 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.048892 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-ggss5_a7888091-f34c-4d28-a756-9d03c77ffcb3/init-textfile/0.log"
Apr 16 21:12:07.155057 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.155027 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-x9qlc_e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43/kube-rbac-proxy-main/0.log"
Apr 16 21:12:07.177999 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.177970 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-x9qlc_e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43/kube-rbac-proxy-self/0.log"
Apr 16 21:12:07.204295 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.204265 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-x9qlc_e6c4b40f-dbb0-49dd-ba5d-a94904cf2f43/openshift-state-metrics/0.log"
Apr 16 21:12:07.449925 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.449895 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-9kq57_6fd62f91-dfe1-46dd-bfd3-4dd5590d3d5b/prometheus-operator-admission-webhook/0.log"
Apr 16 21:12:07.566878 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.566841 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566fc79bc9-s642c_17f468e0-dcb6-4c92-b02e-049ac8f25e1f/thanos-query/0.log"
Apr 16 21:12:07.588998 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.588966 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566fc79bc9-s642c_17f468e0-dcb6-4c92-b02e-049ac8f25e1f/kube-rbac-proxy-web/0.log"
Apr 16 21:12:07.611913 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.611886 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566fc79bc9-s642c_17f468e0-dcb6-4c92-b02e-049ac8f25e1f/kube-rbac-proxy/0.log"
Apr 16 21:12:07.634087 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.634051 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566fc79bc9-s642c_17f468e0-dcb6-4c92-b02e-049ac8f25e1f/prom-label-proxy/0.log"
Apr 16 21:12:07.659498 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.659470 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566fc79bc9-s642c_17f468e0-dcb6-4c92-b02e-049ac8f25e1f/kube-rbac-proxy-rules/0.log"
Apr 16 21:12:07.681883 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:07.681852 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-566fc79bc9-s642c_17f468e0-dcb6-4c92-b02e-049ac8f25e1f/kube-rbac-proxy-metrics/0.log"
Apr 16 21:12:08.867019 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:08.866984 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-nvxzk_eee002f1-2d78-4f01-b6c8-7f7b9567f19b/networking-console-plugin/0.log"
Apr 16 21:12:09.620824 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:09.620786 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-557c549d6c-nmprp_9aaa9c1c-15de-4870-a54f-2ad2e97034e2/console/0.log"
Apr 16 21:12:10.099118 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.099070 2563 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"]
Apr 16 21:12:10.105149 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.105119 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.114666 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.114641 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"]
Apr 16 21:12:10.160441 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.160406 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-sys\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.160602 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.160457 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-proc\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.160602 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.160540 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-lib-modules\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.160602 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.160588 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxqjz\" (UniqueName: \"kubernetes.io/projected/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-kube-api-access-xxqjz\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.160751 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.160706 2563 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-podres\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.261551 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.261517 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-podres\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.261745 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.261594 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-sys\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.261745 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.261637 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-proc\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.261745 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.261671 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-lib-modules\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.261745 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.261693 2563 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxqjz\" (UniqueName: \"kubernetes.io/projected/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-kube-api-access-xxqjz\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.262112 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.262084 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-sys\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.262199 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.262111 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-podres\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.262199 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.262096 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-proc\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.262199 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.262157 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-lib-modules\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.270152 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.270127 2563 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxqjz\" (UniqueName: \"kubernetes.io/projected/0b8750e4-ab48-4cd8-90e2-21c74b05ebf4-kube-api-access-xxqjz\") pod \"perf-node-gather-daemonset-9lpfm\" (UID: \"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4\") " pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.422698 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.422663 2563 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:10.552914 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.552817 2563 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"]
Apr 16 21:12:10.795854 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.795779 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wfhdc_ed1e1b27-b156-463d-9ee6-eaa33682d57c/dns/0.log"
Apr 16 21:12:10.822612 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.822585 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wfhdc_ed1e1b27-b156-463d-9ee6-eaa33682d57c/kube-rbac-proxy/0.log"
Apr 16 21:12:10.851911 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:10.851888 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-bh4x8_48db7bb0-6c87-484f-b5df-58ae1720d8f9/dns-node-resolver/0.log"
Apr 16 21:12:11.154039 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:11.154011 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm" event={"ID":"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4","Type":"ContainerStarted","Data":"cb4cf3c7198c3c2ab759d8749d702a81b16bb453b4b6d3f26ea82ff818cba2b1"}
Apr 16 21:12:11.154410 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:11.154046 2563 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm" event={"ID":"0b8750e4-ab48-4cd8-90e2-21c74b05ebf4","Type":"ContainerStarted","Data":"05e08b76e572ac8f0d59b69c8e2ac6dd44e796c48a21d6b8868439d1d3bd51e7"}
Apr 16 21:12:11.154410 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:11.154127 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm"
Apr 16 21:12:11.174540 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:11.174489 2563 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm" podStartSLOduration=1.174473795 podStartE2EDuration="1.174473795s" podCreationTimestamp="2026-04-16 21:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 21:12:11.17212274 +0000 UTC m=+3628.461048527" watchObservedRunningTime="2026-04-16 21:12:11.174473795 +0000 UTC m=+3628.463399582"
Apr 16 21:12:11.420780 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:11.420752 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-btw62_172b2c56-3bf0-4eef-aab2-4934181bce38/node-ca/0.log"
Apr 16 21:12:12.541991 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:12.541960 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-qsrv5_d78ddec9-9c5c-40a0-b5b1-d748cb8a110c/serve-healthcheck-canary/0.log"
Apr 16 21:12:12.995257 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:12.995228 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m64qv_77f4261b-6ea9-40a7-84ba-cdb40f512c02/kube-rbac-proxy/0.log"
Apr 16 21:12:13.015365 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:13.015337 2563 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-insights_insights-runtime-extractor-m64qv_77f4261b-6ea9-40a7-84ba-cdb40f512c02/exporter/0.log" Apr 16 21:12:13.037923 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:13.037901 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-m64qv_77f4261b-6ea9-40a7-84ba-cdb40f512c02/extractor/0.log" Apr 16 21:12:15.429906 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:15.429875 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-g7mz4_b2a789e9-ca26-4724-a7f7-40f33ee87848/seaweedfs-tls-custom/0.log" Apr 16 21:12:17.166872 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:17.166845 2563 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g6jdf/perf-node-gather-daemonset-9lpfm" Apr 16 21:12:19.768417 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:19.768383 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-v8mkb_49451082-0796-4a50-af11-a585eef9af8c/kube-storage-version-migrator-operator/1.log" Apr 16 21:12:19.771234 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:19.771192 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-v8mkb_49451082-0796-4a50-af11-a585eef9af8c/kube-storage-version-migrator-operator/0.log" Apr 16 21:12:20.901198 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:20.901166 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krhbv_02dbfbd4-16bb-4990-8e97-87e6ff7a47f1/kube-multus-additional-cni-plugins/0.log" Apr 16 21:12:20.926075 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:20.926050 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krhbv_02dbfbd4-16bb-4990-8e97-87e6ff7a47f1/egress-router-binary-copy/0.log" Apr 16 21:12:20.964755 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:20.964721 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krhbv_02dbfbd4-16bb-4990-8e97-87e6ff7a47f1/cni-plugins/0.log" Apr 16 21:12:20.997127 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:20.997105 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krhbv_02dbfbd4-16bb-4990-8e97-87e6ff7a47f1/bond-cni-plugin/0.log" Apr 16 21:12:21.024138 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:21.024109 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krhbv_02dbfbd4-16bb-4990-8e97-87e6ff7a47f1/routeoverride-cni/0.log" Apr 16 21:12:21.049163 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:21.049137 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krhbv_02dbfbd4-16bb-4990-8e97-87e6ff7a47f1/whereabouts-cni-bincopy/0.log" Apr 16 21:12:21.072857 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:21.072830 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-krhbv_02dbfbd4-16bb-4990-8e97-87e6ff7a47f1/whereabouts-cni/0.log" Apr 16 21:12:21.340667 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:21.340631 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-tnt6p_e7efb583-4245-4d53-b571-eaf057bac81b/kube-multus/0.log" Apr 16 21:12:21.457580 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:21.457529 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mx2qh_51782696-d22a-4882-9ad3-4de29c66583c/network-metrics-daemon/0.log" Apr 16 21:12:21.472786 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:21.472764 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-mx2qh_51782696-d22a-4882-9ad3-4de29c66583c/kube-rbac-proxy/0.log" Apr 16 21:12:22.732398 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:22.732364 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/ovn-controller/0.log" Apr 16 21:12:22.784315 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:22.784270 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/ovn-acl-logging/0.log" Apr 16 21:12:22.806907 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:22.806889 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/kube-rbac-proxy-node/0.log" Apr 16 21:12:22.829760 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:22.829723 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 21:12:22.852424 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:22.852376 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/northd/0.log" Apr 16 21:12:22.873947 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:22.873929 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/nbdb/0.log" Apr 16 21:12:22.897000 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:22.896969 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/sbdb/0.log" Apr 16 21:12:23.091626 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:23.091484 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qjb9s_3d468fb0-6c11-4fba-b1e4-ef75ae52d254/ovnkube-controller/0.log" Apr 16 21:12:24.081438 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:24.081408 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-t9s2d_973151c8-de39-4310-b2e4-204c7f502b48/check-endpoints/0.log" Apr 16 21:12:24.128276 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:24.128251 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ssfhx_34c9ee5c-a94b-41cc-8dc1-9d7ff0ef981a/network-check-target-container/0.log" Apr 16 21:12:25.002616 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:25.002583 2563 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-6mc4g_0bd61e48-9d65-473a-b8b7-da6980e29685/iptables-alerter/0.log" Apr 16 21:12:25.704854 ip-10-0-138-118 kubenswrapper[2563]: I0416 21:12:25.704817 2563 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-x9vx5_cd72324a-d342-4d7b-8fde-e0e8a56bbe39/tuned/0.log"