Apr 16 13:56:43.213760 ip-10-0-139-151 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:43.213772 ip-10-0-139-151 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:43.213786 ip-10-0-139-151 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:43.214108 ip-10-0-139-151 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:56:54.389824 ip-10-0-139-151 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:56:54.389845 ip-10-0-139-151 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8c1bda5c1c674b549838907d6f75f6bc --
Apr 16 13:59:18.525885 ip-10-0-139-151 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:18.940923 ip-10-0-139-151 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:18.940923 ip-10-0-139-151 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:18.940923 ip-10-0-139-151 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:18.940923 ip-10-0-139-151 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:18.940923 ip-10-0-139-151 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:18.941842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.941759 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:18.947029 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947012 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:18.947029 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947029 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947033 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947036 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947041 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947045 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947049 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947051 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947054 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947057 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947060 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947063 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947066 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947068 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947071 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947074 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947076 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947079 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947082 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947084 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:18.947102 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947087 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947089 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947095 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947098 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947101 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947104 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947107 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947111 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947114 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947116 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947119 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947122 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947125 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947127 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947130 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947133 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947135 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947138 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947140 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947143 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:18.947556 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947145 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947148 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947150 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947153 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947155 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947158 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947160 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947164 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947168 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947172 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947175 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947178 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947180 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947183 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947186 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947189 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947192 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947195 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947197 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:18.948071 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947200 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947202 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947205 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947208 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947210 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947213 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947215 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947218 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947220 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947223 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947225 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947228 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947231 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947234 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947237 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947240 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947243 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947246 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947249 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947252 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:18.948523 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947254 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947257 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947259 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947262 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947265 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947268 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947271 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947675 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947680 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947684 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947687 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947690 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947693 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947696 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947698 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947701 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947704 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947706 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947709 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947712 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:18.949024 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947714 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947717 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947720 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947722 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947725 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947727 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947730 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947733 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947736 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947738 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947741 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947744 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947746 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947749 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947751 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947754 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947757 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947759 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947762 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947765 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:18.949507 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947769 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947772 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947774 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947777 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947779 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947783 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947785 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947788 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947791 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947793 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947796 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947798 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947801 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947804 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947806 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947808 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947811 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947814 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947817 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947819 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:18.950022 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947822 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947826 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947828 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947831 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947833 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947836 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947838 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947841 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947843 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947846 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947849 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947852 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947855 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947857 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947860 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947863 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947865 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947868 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947870 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:18.950510 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947873 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947875 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947879 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947883 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947886 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947889 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947891 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947894 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947896 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947899 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947902 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947904 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947907 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.947910 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.947985 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.947994 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948000 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948005 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948009 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948013 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948017 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:18.950996 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948021 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948024 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948027 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948032 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948035 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948038 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948042 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948045 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948048 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948051 2574 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948054 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948057 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948061 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948064 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948067 2574 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948070 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948073 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948077 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948081 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948084 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948087 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948091 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948094 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948097 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948100 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:18.951513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948103 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948107 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948111 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948114 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948117 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948120 2574 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948123 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948128 2574 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948131 2574 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948134 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948138 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948141 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948145 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948148 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948151 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948154 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948157 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948160 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948163 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948166 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948169 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948174 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948177 2574 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948181 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948184 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:59:18.952162 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948187 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948190 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948194 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948197 2574 flags.go:64] FLAG: --help="false"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948200 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-139-151.ec2.internal"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948203 2574 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948206 2574 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948209 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948212 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948216 2574 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948220 2574 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948223 2574 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948226 2574 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948228 2574 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948231 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948234 2574 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948237 2574 flags.go:64] FLAG: --kube-reserved=""
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948241 2574 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948243 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948247 2574 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948250 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948252 2574 flags.go:64] FLAG: --lock-file=""
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948255 2574 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948258 2574 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 13:59:18.952842 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948261 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948266 2574 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948270 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948273 2574 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948277 2574 flags.go:64] FLAG: --logging-format="text"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948280 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948283 2574 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948286 2574 flags.go:64] FLAG: --manifest-url=""
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948289 2574 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948293 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948296 2574 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948300 2574 flags.go:64] FLAG: --max-pods="110"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948303 2574 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948306 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948309 2574 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948311 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948314 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948317 2574 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948320 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948331 2574 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948335 2574 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948338 2574 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948341 2574 flags.go:64] FLAG: --pod-cidr=""
Apr 16 13:59:18.953433 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948344 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948350 2574 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948353 2574 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948356 2574 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948359 2574 flags.go:64] FLAG: --port="10250"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948362 2574 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948365 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-074f3677e9c322d77"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948369 2574 flags.go:64] FLAG: --qos-reserved=""
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948372 2574 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948375 2574 flags.go:64] FLAG: --register-node="true"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948378 2574 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948381 2574 flags.go:64] FLAG: --register-with-taints=""
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948384 2574 flags.go:64] FLAG: --registry-burst="10"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948389 2574 flags.go:64] FLAG: --registry-qps="5"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948392 2574 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948395 2574 flags.go:64] FLAG: --reserved-memory=""
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948398 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948401 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948404 2574 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948407 2574 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948409 2574 flags.go:64] FLAG: --runonce="false"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948412 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948415 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948418 2574 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948421 2574 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948424 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 13:59:18.954031 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948428 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948431 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948434 2574 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948438 2574 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948441 2574 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948444 2574 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948446 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948449 2574 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948453 2574 flags.go:64] FLAG: --system-cgroups=""
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948457 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948462 2574 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948465 2574 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948468 2574 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948472 2574 flags.go:64] FLAG: --tls-min-version=""
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948475 2574 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948478 2574 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948480 2574 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948483 2574 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948486 2574 flags.go:64] FLAG: --v="2"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948492 2574 flags.go:64] FLAG: --version="false"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948496 2574 flags.go:64] FLAG: --vmodule=""
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948500 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.948503 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948604 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948608 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:18.954693 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948612 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948615 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948618 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948621 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948623 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948626 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948629 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948631 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948634 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948637 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948639 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948642 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948645 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948648 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948650 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948653 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948657 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948661 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948665 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:18.955332 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948668 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948670 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948673 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948676 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948678 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948681 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948683 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948690 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948693 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948695 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948698 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948701 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948703 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948706 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948708 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948711 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948713 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948716 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948718 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948721 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:18.955831 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948724 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948726 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948729 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948731 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948734 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948737 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948740 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948742 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948745 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948749 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948752 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948754 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948757 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948759 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948762 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948765 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948767 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948770 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948772 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948776 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:18.956325 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948779 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948781 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948784 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948787 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948789 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948792 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948794 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948797 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948799 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948802 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948805 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948807 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948810 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948812 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948815 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948818 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948820 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948824 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948828 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:18.956824 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948831 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948834 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948838 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948841 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948844 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.948846 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.949332 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.955511 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.955525 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955602 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955608 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955611 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955614 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955616 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955619 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:18.957302 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955622 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955624 2574 feature_gate.go:328] unrecognized feature gate:
PinnedImages Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955627 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955630 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955632 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955635 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955637 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955642 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955646 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955649 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955653 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955656 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955658 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955661 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955664 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955666 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955670 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955672 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955675 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955679 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
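Triage note: the W-level feature_gate.go:328 warnings throughout this boot are the kubelet rejecting gate names it does not register itself; the names appear to be cluster-scoped OpenShift feature gates handed down wholesale, so the flood is noisy but not by itself fatal, and the same set repeats once per feature-gate parse pass. A minimal Python sketch for collapsing the flood into one count per gate, assuming the journal was saved to a local text file (the file name below is hypothetical):

    #!/usr/bin/env python3
    """Count 'unrecognized feature gate' warnings in a saved kubelet journal dump."""
    import re
    from collections import Counter

    LOG_PATH = "kubelet-journal.txt"  # hypothetical: adjust to wherever the dump was saved
    GATE_RE = re.compile(r"unrecognized feature gate: (\S+)")

    counts: Counter = Counter()
    with open(LOG_PATH, encoding="utf-8") as fh:
        for line in fh:
            counts.update(GATE_RE.findall(line))

    # Each gate should appear once per feature-gate parse pass
    # (three passes are visible in this capture).
    for gate, n in sorted(counts.items()):
        print(f"{n:3d}  {gate}")
    print(f"{len(counts)} distinct unrecognized gates")
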
Apr 16 13:59:18.957709 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955683 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955686 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955689 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955691 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955694 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955697 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955700 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955703 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955706 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955709 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955711 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955714 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955717 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955719 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955722 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955725 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955727 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955730 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955733 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955735 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:18.958197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955738 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955740 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: 
W0416 13:59:18.955743 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955745 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955748 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955750 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955753 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955755 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955758 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955761 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955764 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955766 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955769 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955771 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955774 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955777 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955780 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955783 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955785 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955788 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:18.958729 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955791 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955793 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955796 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955798 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955801 2574 feature_gate.go:328] 
unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955803 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955806 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955808 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955810 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955813 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955816 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955818 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955821 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955824 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955827 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955829 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955832 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955835 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955837 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:18.959241 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955840 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.955845 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955939 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955944 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955947 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 
13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955950 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955953 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955955 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955958 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955961 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955965 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955968 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955970 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955973 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955976 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:18.959726 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955978 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955981 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955983 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955986 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955989 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955991 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955994 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955997 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.955999 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956001 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956004 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956006 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: 
W0416 13:59:18.956009 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956011 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956015 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956018 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956020 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956023 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956026 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:18.960107 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956028 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956030 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956034 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956036 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956039 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956041 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956044 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956047 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956050 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956053 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956055 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956060 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
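Each parse pass ends with an I-level feature_gate.go:384 entry giving the effective kubelet gate map; that map, not the warnings, is the authoritative outcome, and it is identical across the passes in this capture. A small sketch that turns klog's map[Name:bool ...] rendering into a Python dict, using the map exactly as logged:

    import re

    # Effective-gates map copied verbatim from the feature_gate.go:384 entries in this log.
    LINE = ("feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false "
            "ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false "
            "MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false "
            "ProcMountType:true RouteExternalCertificate:true SELinuxMount:false "
            "ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true "
            "TranslateStreamCloseWebsocketRequests:false "
            "UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true "
            "VolumeAttributesClass:false]}")

    def parse_gates(line: str) -> dict:
        """Parse klog's map[Name:bool ...] form into {name: enabled}."""
        body = re.search(r"map\[(.*)\]", line).group(1)
        return {k: v == "true" for k, v in (pair.split(":") for pair in body.split())}

    print(sorted(name for name, on in parse_gates(LINE).items() if on))
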
Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956063 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956066 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956069 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956072 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956074 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956077 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956079 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956082 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:18.960575 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956084 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956087 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956089 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956092 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956094 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956097 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956100 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956102 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956105 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956107 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956110 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956112 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956115 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956117 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956119 2574 
feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956122 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956125 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956128 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956130 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956133 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:18.961197 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956136 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956138 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956140 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956143 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956145 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956148 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956150 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956153 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956155 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956158 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956160 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956163 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956166 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:18.956168 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.956173 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false 
UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:18.961702 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.956275 2574 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 13:59:18.962100 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.958149 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 13:59:18.962100 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.959099 2574 server.go:1019] "Starting client certificate rotation" Apr 16 13:59:18.962100 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.959196 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:59:18.962100 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.959942 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:59:18.983237 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.983213 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:59:18.987402 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:18.987374 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:59:19.000300 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.000273 2574 log.go:25] "Validated CRI v1 runtime API" Apr 16 13:59:19.007343 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.007324 2574 log.go:25] "Validated CRI v1 image API" Apr 16 13:59:19.010695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.010673 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 13:59:19.013938 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.013918 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:59:19.014434 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.014414 2574 fs.go:135] Filesystem UUIDs: map[659eb8fe-bc71-4537-94e7-84e0fd5b02a2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 b550ad71-f8d8-4d1b-abc0-8d1686ca0420:/dev/nvme0n1p3] Apr 16 13:59:19.014481 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.014435 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 13:59:19.020184 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.020075 2574 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:19.018239847 +0000 UTC m=+0.379782605 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098227 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f8684ee754e1e9eb64031724608eb SystemUUID:ec2f8684-ee75-4e1e-9eb6-4031724608eb BootID:8c1bda5c-1c67-4b54-9838-907d6f75f6bc 
Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:42:60:a5:26:35 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:42:60:a5:26:35 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:9a:09:d1:68:82:03 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 13:59:19.020184 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.020177 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
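The cAdvisor Machine and Filesystems entries above report capacities in raw bytes. Converting them (simple arithmetic, shown as a sketch) confirms the shape of the node: 8 logical CPUs on 4 physical cores, roughly 30.9 GiB of memory, a 120 GiB NVMe disk, and about 119 GiB of it allocated to the /var filesystem:

    GIB = 1024 ** 3

    # Raw byte values copied from the Machine/Filesystems entries in this log.
    memory_capacity = 33_164_488_704   # MemoryCapacity
    var_fs_capacity = 128_243_970_048  # /dev/nvme0n1p4 mounted at /var
    disk_size       = 128_849_018_880  # nvme0n1

    print(f"memory: {memory_capacity / GIB:.1f} GiB")  # ~30.9
    print(f"/var  : {var_fs_capacity / GIB:.1f} GiB")  # ~119.4
    print(f"disk  : {disk_size / GIB:.1f} GiB")        # 120.0
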
Apr 16 13:59:19.020300 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.020260 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 13:59:19.021298 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.021276 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 13:59:19.021446 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.021300 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-139-151.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 13:59:19.021488 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.021455 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 13:59:19.021488 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.021463 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 13:59:19.021488 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.021480 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:19.022349 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.022338 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:19.023076 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.023066 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:19.023181 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.023172 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:59:19.025761 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.025752 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:59:19.026436 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.026427 2574 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 16 13:59:19.026468 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.026450 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:59:19.026468 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.026465 2574 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:59:19.026521 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.026474 2574 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:59:19.027628 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.027615 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:19.027686 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.027635 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:19.030965 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.030949 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:59:19.032559 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.032545 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:19.033954 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033939 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:19.033954 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033957 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033964 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033970 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033976 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033982 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033988 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.033994 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.034000 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.034007 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.034023 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:19.034050 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.034032 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:19.035774 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.035764 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:19.035774 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.035775 2574 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 16 13:59:19.036989 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.036959 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-139-151.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:19.036989 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.036968 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:59:19.038364 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.038349 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hdrdr" Apr 16 13:59:19.039714 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.039702 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:19.039753 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.039740 2574 server.go:1295] "Started kubelet" Apr 16 13:59:19.039851 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.039827 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:19.039930 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.039885 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:19.039995 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.039982 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:19.040402 ip-10-0-139-151 systemd[1]: Started Kubernetes Kubelet. 
Apr 16 13:59:19.041010 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.040998 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:19.042148 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.042135 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:19.046799 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.046773 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-hdrdr" Apr 16 13:59:19.048855 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.048835 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:19.048997 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.048841 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:19.049710 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.049686 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:59:19.049710 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.049712 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:19.049832 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.049723 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:19.049951 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.049936 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:19.049951 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.049951 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:19.050515 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.050496 2574 factory.go:55] Registering systemd factory Apr 16 13:59:19.050515 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.050517 2574 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:19.050914 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.050851 2574 factory.go:153] Registering CRI-O factory Apr 16 13:59:19.050914 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.050868 2574 factory.go:223] Registration of the crio container factory successfully Apr 16 13:59:19.051288 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.050937 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:19.051288 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.050965 2574 factory.go:103] Registering Raw factory Apr 16 13:59:19.051288 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.050979 2574 manager.go:1196] Started watching for new ooms in manager Apr 16 13:59:19.051288 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.051176 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.051923 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.051908 2574 manager.go:319] Starting recovery of all containers Apr 16 13:59:19.055359 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.055336 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:19.059224 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.059055 2574 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes 
\"ip-10-0-139-151.ec2.internal\" not found" node="ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.062128 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.062108 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-151.ec2.internal" not found Apr 16 13:59:19.062878 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.062863 2574 manager.go:324] Recovery completed Apr 16 13:59:19.066867 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.066854 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:19.069611 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.069595 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:19.069690 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.069630 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:19.069690 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.069645 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:19.070196 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.070181 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:59:19.070246 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.070196 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:59:19.070246 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.070218 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:19.073309 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.073295 2574 policy_none.go:49] "None policy: Start" Apr 16 13:59:19.073346 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.073312 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:59:19.073346 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.073322 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:59:19.077883 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.077867 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-151.ec2.internal" not found Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.112133 2574 manager.go:341] "Starting Device Plugin manager" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.112165 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.112177 2574 server.go:85] "Starting device plugin registration server" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.112472 2574 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.112507 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.112619 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.112692 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.112700 2574 plugin_manager.go:118] 
"Starting Kubelet Plugin Manager" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.113362 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:59:19.123081 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.113402 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.135885 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.135862 2574 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-139-151.ec2.internal" not found Apr 16 13:59:19.181051 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.181014 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:59:19.182428 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.182410 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:59:19.182528 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.182440 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:59:19.182528 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.182458 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 16 13:59:19.182528 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.182465 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 13:59:19.182528 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.182500 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 13:59:19.185391 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.185369 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:19.213670 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.213607 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:19.214750 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.214732 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:19.214818 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.214767 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:19.216629 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.215051 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:19.216629 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.215111 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.225524 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.225503 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.225659 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.225533 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-139-151.ec2.internal\": node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.243595 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.243567 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.282942 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.282888 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal"] Apr 16 13:59:19.283040 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.283004 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:19.285172 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.285157 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:19.285266 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.285186 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:19.285266 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.285198 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:19.287523 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.287510 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:19.287670 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.287656 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.287711 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.287687 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:19.288281 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.288259 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:19.288373 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.288287 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:19.288373 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.288296 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:19.288373 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.288303 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:19.288373 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.288322 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:19.288373 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.288332 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:19.290961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.290945 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.291037 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.290969 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:19.291739 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.291722 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:19.291807 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.291749 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:19.291807 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.291760 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:19.317252 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.317230 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-151.ec2.internal\" not found" node="ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.321856 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.321838 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-139-151.ec2.internal\" not found" node="ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.343891 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.343866 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.351200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.351176 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2bd4e5594979c64677900b97136c3521-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal\" (UID: \"2bd4e5594979c64677900b97136c3521\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.351283 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.351210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd4e5594979c64677900b97136c3521-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal\" (UID: \"2bd4e5594979c64677900b97136c3521\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.351283 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.351229 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c5729c9ff098ff2004acf3ccde20c30f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-151.ec2.internal\" (UID: \"c5729c9ff098ff2004acf3ccde20c30f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.444085 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.444058 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.451403 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.451381 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/2bd4e5594979c64677900b97136c3521-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal\" (UID: \"2bd4e5594979c64677900b97136c3521\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.451456 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.451411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd4e5594979c64677900b97136c3521-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal\" (UID: \"2bd4e5594979c64677900b97136c3521\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.451456 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.451428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c5729c9ff098ff2004acf3ccde20c30f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-151.ec2.internal\" (UID: \"c5729c9ff098ff2004acf3ccde20c30f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.451553 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.451471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c5729c9ff098ff2004acf3ccde20c30f-config\") pod \"kube-apiserver-proxy-ip-10-0-139-151.ec2.internal\" (UID: \"c5729c9ff098ff2004acf3ccde20c30f\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.451553 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.451479 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/2bd4e5594979c64677900b97136c3521-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal\" (UID: \"2bd4e5594979c64677900b97136c3521\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.451553 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.451509 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2bd4e5594979c64677900b97136c3521-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal\" (UID: \"2bd4e5594979c64677900b97136c3521\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.544836 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.544802 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.620327 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.620286 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.623770 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.623750 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" Apr 16 13:59:19.645099 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.645071 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.745666 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.745626 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.846145 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.846105 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.946757 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:19.946731 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:19.959171 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.959150 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 13:59:19.959318 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.959295 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:59:19.959355 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:19.959306 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:59:20.047727 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:20.047694 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:20.049897 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.049874 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:20.050367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.050341 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:19 +0000 UTC" deadline="2027-10-20 08:03:17.806192302 +0000 UTC" Apr 16 13:59:20.050419 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.050369 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13242h3m57.755826939s" Apr 16 13:59:20.070959 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.070933 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:59:20.090513 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.090484 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-7qcdj" Apr 16 13:59:20.098656 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.098596 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-7qcdj" Apr 16 13:59:20.148651 ip-10-0-139-151 kubenswrapper[2574]: E0416 
13:59:20.148625 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-139-151.ec2.internal\" not found" Apr 16 13:59:20.214500 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.214473 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:20.224179 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:20.224143 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5729c9ff098ff2004acf3ccde20c30f.slice/crio-a2349cd9aa4bdbdc03d74e10db76d7b3a0834ed4ec36e0738f5df9dabc7e97e6 WatchSource:0}: Error finding container a2349cd9aa4bdbdc03d74e10db76d7b3a0834ed4ec36e0738f5df9dabc7e97e6: Status 404 returned error can't find the container with id a2349cd9aa4bdbdc03d74e10db76d7b3a0834ed4ec36e0738f5df9dabc7e97e6 Apr 16 13:59:20.224866 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:20.224844 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bd4e5594979c64677900b97136c3521.slice/crio-dc042a111114febd71c640790016c37d9bb4bf22a12ce57fb538b40c6256dde8 WatchSource:0}: Error finding container dc042a111114febd71c640790016c37d9bb4bf22a12ce57fb538b40c6256dde8: Status 404 returned error can't find the container with id dc042a111114febd71c640790016c37d9bb4bf22a12ce57fb538b40c6256dde8 Apr 16 13:59:20.228670 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.228656 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:59:20.249006 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.248980 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" Apr 16 13:59:20.260632 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.260612 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:20.262210 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.262198 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" Apr 16 13:59:20.268760 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.268746 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:20.451160 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:20.451081 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:21.028200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.028167 2574 apiserver.go:52] "Watching apiserver" Apr 16 13:59:21.034348 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.034323 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:59:21.035791 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.035763 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-additional-cni-plugins-c2sdt","openshift-network-operator/iptables-alerter-zzzpm","openshift-ovn-kubernetes/ovnkube-node-l6b5l","kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal","openshift-image-registry/node-ca-xzrpt","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal","openshift-multus/network-metrics-daemon-fbnhb","openshift-network-diagnostics/network-check-target-vl5lw","kube-system/konnectivity-agent-l76kn","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs","openshift-cluster-node-tuning-operator/tuned-sfmlv","openshift-dns/node-resolver-65gv2","openshift-multus/multus-44586"] Apr 16 13:59:21.038263 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.038243 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.040467 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.040413 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:21.040564 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.040493 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.040843 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.040715 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:21.040843 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.040744 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:21.040843 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.040722 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-5rgm5\"" Apr 16 13:59:21.043110 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.043087 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.044020 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.043853 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:21.044020 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.043906 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:21.044020 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.043920 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-htdc2\"" Apr 16 13:59:21.044020 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.043927 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:21.046524 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.046306 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:21.046524 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.046307 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:21.046524 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.046360 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:21.046524 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.046370 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:21.046524 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.046329 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cg5wp\"" Apr 16 13:59:21.046824 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.046560 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:21.046824 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.046646 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:21.049045 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.049022 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.049149 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.049134 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:21.049230 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.049207 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:21.051273 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.051254 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:21.051364 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.051313 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:21.052627 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.052605 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:21.053024 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.052890 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-lrkm6\"" Apr 16 13:59:21.053024 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.052933 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:21.053024 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.053018 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:21.053464 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.053446 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.055978 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.055930 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.056896 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.056683 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lmmnh\"" Apr 16 13:59:21.056896 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.056723 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:21.057040 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.056979 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:21.057646 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.057627 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:21.058395 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/567139c7-8d34-429b-bd38-0ab1aafa14e9-host\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.058395 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058314 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7khdc\" (UniqueName: \"kubernetes.io/projected/567139c7-8d34-429b-bd38-0ab1aafa14e9-kube-api-access-7khdc\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.058395 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfd2w\" (UniqueName: \"kubernetes.io/projected/2febdb01-c922-4ac4-81e6-2b92df909f85-kube-api-access-pfd2w\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.058395 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058347 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.058654 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058369 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-slash\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058654 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058598 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058654 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058647 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-cni-bin\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058801 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-sys-fs\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.058801 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-log-socket\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058801 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058736 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-cni-netd\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058801 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058766 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058976 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058812 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:21.058976 
ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058837 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:21.058976 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-socket-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.058976 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t97pk\" (UniqueName: \"kubernetes.io/projected/0a6ebd77-c55d-495e-acdf-6afc122a2621-kube-api-access-t97pk\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.058976 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058922 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-run-netns\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058976 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058951 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467df738-bd26-4dba-b771-01c7f6844b70-ovn-node-metrics-cert\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.058976 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058974 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-ovnkube-script-lib\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.058992 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-device-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-etc-selinux\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.059270 ip-10-0-139-151 
kubenswrapper[2574]: I0416 13:59:21.059031 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/567139c7-8d34-429b-bd38-0ab1aafa14e9-serviceca\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059079 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwgwn\" (UniqueName: \"kubernetes.io/projected/3d0ab572-848b-495c-afdf-ad744ea2b230-kube-api-access-zwgwn\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059113 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-kubelet\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059158 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-systemd-units\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059181 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-systemd\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059207 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-schp7\" (UniqueName: \"kubernetes.io/projected/467df738-bd26-4dba-b771-01c7f6844b70-kube-api-access-schp7\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059232 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-registration-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.059270 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059256 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2febdb01-c922-4ac4-81e6-2b92df909f85-host-slash\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059290 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-ovn\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059334 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-env-overrides\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059365 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-var-lib-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059389 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059414 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-node-log\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059438 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059483 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2febdb01-c922-4ac4-81e6-2b92df909f85-iptables-alerter-script\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059507 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-etc-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.059766 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.059531 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-ovnkube-config\") pod 
\"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.060706 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.060686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.060806 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.060759 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:21.060806 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.060801 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:59:21.061155 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.061137 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m2ddc\"" Apr 16 13:59:21.062408 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.062389 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:21.062631 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.062594 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:21.062872 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.062854 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:21.063123 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.063104 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-mph7k\"" Apr 16 13:59:21.063123 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.063120 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-44586" Apr 16 13:59:21.063875 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.063725 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:21.063945 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.063923 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:21.064748 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.064726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cqpdx\"" Apr 16 13:59:21.065036 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.065015 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:59:21.065036 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.065027 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:21.065344 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.065325 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5kxlr\"" Apr 16 13:59:21.100311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.100282 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:20 +0000 UTC" deadline="2027-10-15 16:07:36.110909211 +0000 UTC" Apr 16 13:59:21.100311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.100309 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13130h8m15.010603129s" Apr 16 13:59:21.150408 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.150380 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:21.159751 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/567139c7-8d34-429b-bd38-0ab1aafa14e9-serviceca\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.159920 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159764 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rf7z\" (UniqueName: \"kubernetes.io/projected/aa030d80-2a63-4669-acb7-9485b1b8ce4a-kube-api-access-2rf7z\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.159920 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-cni-multus\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.159920 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-cnibin\") pod 
\"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.159920 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159848 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysconfig\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.159920 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-systemd\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.159920 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159891 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-var-lib-kubelet\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.159920 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159945 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2febdb01-c922-4ac4-81e6-2b92df909f85-iptables-alerter-script\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-etc-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.159999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-multus-certs\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160045 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysctl-conf\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: 
I0416 13:59:21.160087 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfd2w\" (UniqueName: \"kubernetes.io/projected/2febdb01-c922-4ac4-81e6-2b92df909f85-kube-api-access-pfd2w\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160114 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-etc-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160186 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f53d27c-7378-4e8b-8dfd-f39beb70f859-agent-certs\") pod \"konnectivity-agent-l76kn\" (UID: \"0f53d27c-7378-4e8b-8dfd-f39beb70f859\") " pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.160248 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160231 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-run-ovn-kubernetes\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0077ff0-36ff-4fe1-bc19-c63239f74a39-tmp\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160284 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/567139c7-8d34-429b-bd38-0ab1aafa14e9-serviceca\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160309 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-sys-fs\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-sys-fs\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160372 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160380 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t97pk\" (UniqueName: \"kubernetes.io/projected/0a6ebd77-c55d-495e-acdf-6afc122a2621-kube-api-access-t97pk\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160439 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467df738-bd26-4dba-b771-01c7f6844b70-ovn-node-metrics-cert\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160503 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-os-release\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160528 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-conf-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160531 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/2febdb01-c922-4ac4-81e6-2b92df909f85-iptables-alerter-script\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160550 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-modprobe-d\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160572 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-run\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-lib-modules\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160635 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-device-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.160708 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-etc-selinux\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160726 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-device-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160770 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-cni-bin\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160834 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-etc-selinux\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-daemon-config\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160937 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysctl-d\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.160976 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwgwn\" (UniqueName: \"kubernetes.io/projected/3d0ab572-848b-495c-afdf-ad744ea2b230-kube-api-access-zwgwn\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161012 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-kubelet\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-kubelet\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-systemd-units\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-systemd\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161179 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-schp7\" (UniqueName: \"kubernetes.io/projected/467df738-bd26-4dba-b771-01c7f6844b70-kube-api-access-schp7\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161198 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-systemd-units\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-systemd\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161212 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edb2f95f-5ada-45b4-862b-187eab79d4a5-cni-binary-copy\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161259 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf7b2\" (UniqueName: \"kubernetes.io/projected/edb2f95f-5ada-45b4-862b-187eab79d4a5-kube-api-access-hf7b2\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.161367 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161288 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-registration-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2febdb01-c922-4ac4-81e6-2b92df909f85-host-slash\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161338 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-ovn\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161351 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-registration-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.162200 ip-10-0-139-151 
kubenswrapper[2574]: I0416 13:59:21.161362 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-env-overrides\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-run-ovn\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161409 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-socket-dir-parent\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2febdb01-c922-4ac4-81e6-2b92df909f85-host-slash\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161445 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-hostroot\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161469 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f53d27c-7378-4e8b-8dfd-f39beb70f859-konnectivity-ca\") pod \"konnectivity-agent-l76kn\" (UID: \"0f53d27c-7378-4e8b-8dfd-f39beb70f859\") " pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-os-release\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161540 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-var-lib-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-node-log\") pod \"ovnkube-node-l6b5l\" (UID: 
\"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161629 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-k8s-cni-cncf-io\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161652 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-var-lib-openvswitch\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161658 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s928p\" (UniqueName: \"kubernetes.io/projected/5596cb4f-7692-4c74-82c7-87e46bdfd720-kube-api-access-s928p\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161687 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.162200 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161689 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-node-log\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161715 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-ovnkube-config\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161751 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-system-cni-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161803 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-tuned\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161827 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa030d80-2a63-4669-acb7-9485b1b8ce4a-tmp-dir\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161854 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/567139c7-8d34-429b-bd38-0ab1aafa14e9-host\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161860 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-env-overrides\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7khdc\" (UniqueName: \"kubernetes.io/projected/567139c7-8d34-429b-bd38-0ab1aafa14e9-kube-api-access-7khdc\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161906 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-slash\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161922 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-kubelet-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-cni-bin\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/567139c7-8d34-429b-bd38-0ab1aafa14e9-host\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161969 
2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-cnibin\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-cni-bin\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.161984 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-slash\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162009 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-netns\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.162873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-kubelet\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162059 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-cni-binary-copy\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162085 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-log-socket\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162122 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-cni-netd\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162149 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-cni-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 
13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162153 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-log-socket\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-ovnkube-config\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162215 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-kubernetes\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162193 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-cni-netd\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162252 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-sys\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162283 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162302 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-socket-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162324 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-run-netns\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-ovnkube-script-lib\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.162348 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162359 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.163484 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162387 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clmp5\" (UniqueName: \"kubernetes.io/projected/c0077ff0-36ff-4fe1-bc19-c63239f74a39-kube-api-access-clmp5\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/467df738-bd26-4dba-b771-01c7f6844b70-host-run-netns\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.162419 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:21.6623982 +0000 UTC m=+3.023940949 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162436 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aa030d80-2a63-4669-acb7-9485b1b8ce4a-hosts-file\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162462 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-etc-kubernetes\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162486 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-system-cni-dir\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162508 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-host\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162510 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0a6ebd77-c55d-495e-acdf-6afc122a2621-socket-dir\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.164041 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.162843 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/467df738-bd26-4dba-b771-01c7f6844b70-ovnkube-script-lib\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.164551 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.164514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/467df738-bd26-4dba-b771-01c7f6844b70-ovn-node-metrics-cert\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.168607 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.168560 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:21.168607 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.168600 2574 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:21.168607 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.168613 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hj8bl for pod openshift-network-diagnostics/network-check-target-vl5lw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:21.168851 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.168676 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl podName:5b2a2f05-9a25-4652-ba72-816977b324b5 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:21.668657772 +0000 UTC m=+3.030200539 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-hj8bl" (UniqueName: "kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl") pod "network-check-target-vl5lw" (UID: "5b2a2f05-9a25-4652-ba72-816977b324b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:21.168919 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.168858 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfd2w\" (UniqueName: \"kubernetes.io/projected/2febdb01-c922-4ac4-81e6-2b92df909f85-kube-api-access-pfd2w\") pod \"iptables-alerter-zzzpm\" (UID: \"2febdb01-c922-4ac4-81e6-2b92df909f85\") " pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.171103 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.171079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7khdc\" (UniqueName: \"kubernetes.io/projected/567139c7-8d34-429b-bd38-0ab1aafa14e9-kube-api-access-7khdc\") pod \"node-ca-xzrpt\" (UID: \"567139c7-8d34-429b-bd38-0ab1aafa14e9\") " pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.171328 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.171289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t97pk\" (UniqueName: \"kubernetes.io/projected/0a6ebd77-c55d-495e-acdf-6afc122a2621-kube-api-access-t97pk\") pod \"aws-ebs-csi-driver-node-r9zqs\" (UID: \"0a6ebd77-c55d-495e-acdf-6afc122a2621\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.172002 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.171977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwgwn\" (UniqueName: \"kubernetes.io/projected/3d0ab572-848b-495c-afdf-ad744ea2b230-kube-api-access-zwgwn\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:21.172880 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.172861 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-schp7\" (UniqueName: \"kubernetes.io/projected/467df738-bd26-4dba-b771-01c7f6844b70-kube-api-access-schp7\") pod \"ovnkube-node-l6b5l\" (UID: \"467df738-bd26-4dba-b771-01c7f6844b70\") " pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.187235 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.187185 2574 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" event={"ID":"2bd4e5594979c64677900b97136c3521","Type":"ContainerStarted","Data":"dc042a111114febd71c640790016c37d9bb4bf22a12ce57fb538b40c6256dde8"} Apr 16 13:59:21.188236 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.188210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" event={"ID":"c5729c9ff098ff2004acf3ccde20c30f","Type":"ContainerStarted","Data":"a2349cd9aa4bdbdc03d74e10db76d7b3a0834ed4ec36e0738f5df9dabc7e97e6"} Apr 16 13:59:21.263066 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rf7z\" (UniqueName: \"kubernetes.io/projected/aa030d80-2a63-4669-acb7-9485b1b8ce4a-kube-api-access-2rf7z\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.263066 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263074 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-cni-multus\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.263311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-cnibin\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.263311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysconfig\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263149 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-systemd\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263174 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-var-lib-kubelet\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263185 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-cni-multus\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.263311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263202 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-multus-certs\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.263311 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263289 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-systemd\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263337 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-multus-certs\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysctl-conf\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-cnibin\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f53d27c-7378-4e8b-8dfd-f39beb70f859-agent-certs\") pod \"konnectivity-agent-l76kn\" (UID: \"0f53d27c-7378-4e8b-8dfd-f39beb70f859\") " pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0077ff0-36ff-4fe1-bc19-c63239f74a39-tmp\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263459 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysconfig\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-os-release\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263547 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-var-lib-kubelet\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-os-release\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263556 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysctl-conf\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263592 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-conf-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.263635 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-modprobe-d\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263646 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-run\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263667 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-lib-modules\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263621 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-conf-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263692 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-cni-bin\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263721 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-cni-bin\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-daemon-config\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263751 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysctl-d\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edb2f95f-5ada-45b4-862b-187eab79d4a5-cni-binary-copy\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263804 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hf7b2\" (UniqueName: \"kubernetes.io/projected/edb2f95f-5ada-45b4-862b-187eab79d4a5-kube-api-access-hf7b2\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263863 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-socket-dir-parent\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263895 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-hostroot\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263925 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f53d27c-7378-4e8b-8dfd-f39beb70f859-konnectivity-ca\") pod \"konnectivity-agent-l76kn\" (UID: \"0f53d27c-7378-4e8b-8dfd-f39beb70f859\") " pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263951 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-os-release\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263963 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" 
(UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-sysctl-d\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.263982 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-k8s-cni-cncf-io\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s928p\" (UniqueName: \"kubernetes.io/projected/5596cb4f-7692-4c74-82c7-87e46bdfd720-kube-api-access-s928p\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-run\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264121 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264048 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-modprobe-d\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264077 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-hostroot\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264095 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-system-cni-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-system-cni-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264139 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-lib-modules\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-tuned\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264190 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa030d80-2a63-4669-acb7-9485b1b8ce4a-tmp-dir\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264220 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-cnibin\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264243 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-netns\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264252 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-daemon-config\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264268 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-kubelet\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-cni-binary-copy\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264325 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-cni-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264333 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-k8s-cni-cncf-io\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264351 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-kubernetes\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264385 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-os-release\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.264961 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-sys\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/edb2f95f-5ada-45b4-862b-187eab79d4a5-cni-binary-copy\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264444 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264473 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-cnibin\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264483 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-cni-dir\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264474 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"kube-api-access-clmp5\" (UniqueName: \"kubernetes.io/projected/c0077ff0-36ff-4fe1-bc19-c63239f74a39-kube-api-access-clmp5\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aa030d80-2a63-4669-acb7-9485b1b8ce4a-hosts-file\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-var-lib-kubelet\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264542 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-sys\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264545 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-etc-kubernetes\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264575 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-system-cni-dir\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-host\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/0f53d27c-7378-4e8b-8dfd-f39beb70f859-konnectivity-ca\") pod \"konnectivity-agent-l76kn\" (UID: \"0f53d27c-7378-4e8b-8dfd-f39beb70f859\") " pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264422 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-multus-socket-dir-parent\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264685 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-host\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264703 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-kubernetes\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264722 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-system-cni-dir\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264597 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-etc-kubernetes\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.265809 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.264424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/edb2f95f-5ada-45b4-862b-187eab79d4a5-host-run-netns\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.266695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.265018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa030d80-2a63-4669-acb7-9485b1b8ce4a-tmp-dir\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.266695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.265128 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-cni-binary-copy\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.266695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.265189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/aa030d80-2a63-4669-acb7-9485b1b8ce4a-hosts-file\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.266695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.265188 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.266695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.265254 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5596cb4f-7692-4c74-82c7-87e46bdfd720-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.266695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.265653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5596cb4f-7692-4c74-82c7-87e46bdfd720-tuning-conf-dir\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.266695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.266411 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/c0077ff0-36ff-4fe1-bc19-c63239f74a39-tmp\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.267053 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.266752 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/c0077ff0-36ff-4fe1-bc19-c63239f74a39-etc-tuned\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.267179 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.267160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/0f53d27c-7378-4e8b-8dfd-f39beb70f859-agent-certs\") pod \"konnectivity-agent-l76kn\" (UID: \"0f53d27c-7378-4e8b-8dfd-f39beb70f859\") " pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.273453 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.273420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s928p\" (UniqueName: \"kubernetes.io/projected/5596cb4f-7692-4c74-82c7-87e46bdfd720-kube-api-access-s928p\") pod \"multus-additional-cni-plugins-c2sdt\" (UID: \"5596cb4f-7692-4c74-82c7-87e46bdfd720\") " pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.273653 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.273631 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clmp5\" (UniqueName: \"kubernetes.io/projected/c0077ff0-36ff-4fe1-bc19-c63239f74a39-kube-api-access-clmp5\") pod \"tuned-sfmlv\" (UID: \"c0077ff0-36ff-4fe1-bc19-c63239f74a39\") " pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.273877 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.273854 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rf7z\" (UniqueName: \"kubernetes.io/projected/aa030d80-2a63-4669-acb7-9485b1b8ce4a-kube-api-access-2rf7z\") pod \"node-resolver-65gv2\" (UID: \"aa030d80-2a63-4669-acb7-9485b1b8ce4a\") " pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.274102 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.274082 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf7b2\" (UniqueName: \"kubernetes.io/projected/edb2f95f-5ada-45b4-862b-187eab79d4a5-kube-api-access-hf7b2\") pod \"multus-44586\" (UID: \"edb2f95f-5ada-45b4-862b-187eab79d4a5\") " pod="openshift-multus/multus-44586" Apr 16 13:59:21.352073 ip-10-0-139-151 
kubenswrapper[2574]: I0416 13:59:21.352028 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" Apr 16 13:59:21.359892 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.359864 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zzzpm" Apr 16 13:59:21.373566 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.373543 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:21.379212 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.379167 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xzrpt" Apr 16 13:59:21.381432 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.381410 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:21.385853 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.385826 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:21.386094 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.386079 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:21.394745 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.394723 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" Apr 16 13:59:21.401360 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.401337 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" Apr 16 13:59:21.407942 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.407925 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-65gv2" Apr 16 13:59:21.413515 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.413498 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-44586" Apr 16 13:59:21.534911 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.534885 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod567139c7_8d34_429b_bd38_0ab1aafa14e9.slice/crio-6435df5940298bddcc5739b1a1c7afa58e462c7bd753497d6a6e953147424d9e WatchSource:0}: Error finding container 6435df5940298bddcc5739b1a1c7afa58e462c7bd753497d6a6e953147424d9e: Status 404 returned error can't find the container with id 6435df5940298bddcc5739b1a1c7afa58e462c7bd753497d6a6e953147424d9e Apr 16 13:59:21.536044 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.535928 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a6ebd77_c55d_495e_acdf_6afc122a2621.slice/crio-a250b8a2e28609c4b16de3cbc9bf131561c686ddc201c80e60d548c838b8dc2b WatchSource:0}: Error finding container a250b8a2e28609c4b16de3cbc9bf131561c686ddc201c80e60d548c838b8dc2b: Status 404 returned error can't find the container with id a250b8a2e28609c4b16de3cbc9bf131561c686ddc201c80e60d548c838b8dc2b Apr 16 13:59:21.538126 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.538097 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5596cb4f_7692_4c74_82c7_87e46bdfd720.slice/crio-ee4a8d4b06bfc695f3b69744fe2772f49cc4280912cd7782ca4b989a3bd8cec3 WatchSource:0}: Error finding container ee4a8d4b06bfc695f3b69744fe2772f49cc4280912cd7782ca4b989a3bd8cec3: Status 404 returned error can't find the container with id ee4a8d4b06bfc695f3b69744fe2772f49cc4280912cd7782ca4b989a3bd8cec3 Apr 16 13:59:21.540146 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.540122 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f53d27c_7378_4e8b_8dfd_f39beb70f859.slice/crio-843b36571efe1fe2e7cd677ecb9c360ec9e068e032bbaf373bfb81fed8b2fb16 WatchSource:0}: Error finding container 843b36571efe1fe2e7cd677ecb9c360ec9e068e032bbaf373bfb81fed8b2fb16: Status 404 returned error can't find the container with id 843b36571efe1fe2e7cd677ecb9c360ec9e068e032bbaf373bfb81fed8b2fb16 Apr 16 13:59:21.541081 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.541057 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod467df738_bd26_4dba_b771_01c7f6844b70.slice/crio-fc5cd986ac6285bb44adc2511a734baa6a4d25c8d212942894fa5bf5649ae0b9 WatchSource:0}: Error finding container fc5cd986ac6285bb44adc2511a734baa6a4d25c8d212942894fa5bf5649ae0b9: Status 404 returned error can't find the container with id fc5cd986ac6285bb44adc2511a734baa6a4d25c8d212942894fa5bf5649ae0b9 Apr 16 13:59:21.542740 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.542668 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedb2f95f_5ada_45b4_862b_187eab79d4a5.slice/crio-3e06ddb11753e4cf3a428d8311ffc1ce916733dc4a03ee4e12a7b875d2f024b8 WatchSource:0}: Error finding container 3e06ddb11753e4cf3a428d8311ffc1ce916733dc4a03ee4e12a7b875d2f024b8: Status 404 returned error can't find the container with id 3e06ddb11753e4cf3a428d8311ffc1ce916733dc4a03ee4e12a7b875d2f024b8 Apr 16 13:59:21.544016 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.543982 2574 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0077ff0_36ff_4fe1_bc19_c63239f74a39.slice/crio-3e1c3634eb50754fea6f263cca79eeda092ed318a04ac14b5fe81110ef44ce25 WatchSource:0}: Error finding container 3e1c3634eb50754fea6f263cca79eeda092ed318a04ac14b5fe81110ef44ce25: Status 404 returned error can't find the container with id 3e1c3634eb50754fea6f263cca79eeda092ed318a04ac14b5fe81110ef44ce25 Apr 16 13:59:21.546396 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.545524 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa030d80_2a63_4669_acb7_9485b1b8ce4a.slice/crio-3cdb9a376be4267aaca36e57d19153b0e71fa2b8a99bb0c6669baa12278cc5f9 WatchSource:0}: Error finding container 3cdb9a376be4267aaca36e57d19153b0e71fa2b8a99bb0c6669baa12278cc5f9: Status 404 returned error can't find the container with id 3cdb9a376be4267aaca36e57d19153b0e71fa2b8a99bb0c6669baa12278cc5f9 Apr 16 13:59:21.546396 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:21.546066 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2febdb01_c922_4ac4_81e6_2b92df909f85.slice/crio-bf0e511055a5325de862b7fea7302177690af20574b26a2624fde46bcb0cf38b WatchSource:0}: Error finding container bf0e511055a5325de862b7fea7302177690af20574b26a2624fde46bcb0cf38b: Status 404 returned error can't find the container with id bf0e511055a5325de862b7fea7302177690af20574b26a2624fde46bcb0cf38b Apr 16 13:59:21.667019 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.666942 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:21.667151 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.667084 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:21.667188 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.667152 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.667133521 +0000 UTC m=+4.028676268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:21.768044 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:21.768008 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:21.768207 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.768192 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:21.768251 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.768212 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:21.768251 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.768222 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hj8bl for pod openshift-network-diagnostics/network-check-target-vl5lw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:21.768311 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:21.768280 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl podName:5b2a2f05-9a25-4652-ba72-816977b324b5 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.768265933 +0000 UTC m=+4.129808687 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access-hj8bl" (UniqueName: "kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl") pod "network-check-target-vl5lw" (UID: "5b2a2f05-9a25-4652-ba72-816977b324b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:22.101039 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.100837 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:20 +0000 UTC" deadline="2027-09-11 22:42:14.765819473 +0000 UTC"
Apr 16 13:59:22.101567 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.101042 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12320h42m52.664783875s"
Apr 16 13:59:22.194331 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.194266 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44586" event={"ID":"edb2f95f-5ada-45b4-862b-187eab79d4a5","Type":"ContainerStarted","Data":"3e06ddb11753e4cf3a428d8311ffc1ce916733dc4a03ee4e12a7b875d2f024b8"}
Apr 16 13:59:22.196032 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.195968 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"fc5cd986ac6285bb44adc2511a734baa6a4d25c8d212942894fa5bf5649ae0b9"}
Apr 16 13:59:22.197406 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.197345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l76kn" event={"ID":"0f53d27c-7378-4e8b-8dfd-f39beb70f859","Type":"ContainerStarted","Data":"843b36571efe1fe2e7cd677ecb9c360ec9e068e032bbaf373bfb81fed8b2fb16"}
Apr 16 13:59:22.199889 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.199831 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerStarted","Data":"ee4a8d4b06bfc695f3b69744fe2772f49cc4280912cd7782ca4b989a3bd8cec3"}
Apr 16 13:59:22.201323 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.201259 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xzrpt" event={"ID":"567139c7-8d34-429b-bd38-0ab1aafa14e9","Type":"ContainerStarted","Data":"6435df5940298bddcc5739b1a1c7afa58e462c7bd753497d6a6e953147424d9e"}
Apr 16 13:59:22.202442 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.202385 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zzzpm" event={"ID":"2febdb01-c922-4ac4-81e6-2b92df909f85","Type":"ContainerStarted","Data":"bf0e511055a5325de862b7fea7302177690af20574b26a2624fde46bcb0cf38b"}
Apr 16 13:59:22.205438 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.205415 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" event={"ID":"0a6ebd77-c55d-495e-acdf-6afc122a2621","Type":"ContainerStarted","Data":"a250b8a2e28609c4b16de3cbc9bf131561c686ddc201c80e60d548c838b8dc2b"}
Apr 16 13:59:22.207658 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.207637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" event={"ID":"2bd4e5594979c64677900b97136c3521","Type":"ContainerStarted","Data":"3bbaf61c0d39f3bd4c250a2178d78cabc30f93c3ad70d8f58bbfe964630e7e0e"}
Apr 16 13:59:22.209786 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.209762 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-65gv2" event={"ID":"aa030d80-2a63-4669-acb7-9485b1b8ce4a","Type":"ContainerStarted","Data":"3cdb9a376be4267aaca36e57d19153b0e71fa2b8a99bb0c6669baa12278cc5f9"}
Apr 16 13:59:22.220801 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.220720 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" event={"ID":"c0077ff0-36ff-4fe1-bc19-c63239f74a39","Type":"ContainerStarted","Data":"3e1c3634eb50754fea6f263cca79eeda092ed318a04ac14b5fe81110ef44ce25"}
Apr 16 13:59:22.677520 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.677480 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:22.677729 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:22.677645 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:22.677729 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:22.677714 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.677694479 +0000 UTC m=+6.039237271 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:22.778698 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:22.778049 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:22.778698 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:22.778247 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:22.778698 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:22.778266 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:22.778698 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:22.778280 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hj8bl for pod openshift-network-diagnostics/network-check-target-vl5lw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:22.778698 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:22.778338 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl podName:5b2a2f05-9a25-4652-ba72-816977b324b5 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.778319408 +0000 UTC m=+6.139862177 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-hj8bl" (UniqueName: "kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl") pod "network-check-target-vl5lw" (UID: "5b2a2f05-9a25-4652-ba72-816977b324b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:23.184690 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.182788 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:23.184690 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:23.182919 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5"
Apr 16 13:59:23.184690 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.183346 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:23.184690 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:23.183441 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230"
Apr 16 13:59:23.227059 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.226291 2574 generic.go:358] "Generic (PLEG): container finished" podID="2bd4e5594979c64677900b97136c3521" containerID="3bbaf61c0d39f3bd4c250a2178d78cabc30f93c3ad70d8f58bbfe964630e7e0e" exitCode=0
Apr 16 13:59:23.227059 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.226342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" event={"ID":"2bd4e5594979c64677900b97136c3521","Type":"ContainerDied","Data":"3bbaf61c0d39f3bd4c250a2178d78cabc30f93c3ad70d8f58bbfe964630e7e0e"}
Apr 16 13:59:23.526652 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.526165 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-snqxc"]
Apr 16 13:59:23.531679 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.531130 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.531679 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:23.531259 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9"
Apr 16 13:59:23.584402 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.584114 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-dbus\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.584402 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.584166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.584402 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.584276 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-kubelet-config\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.685294 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.685253 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-kubelet-config\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.685456 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.685318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-dbus\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.685456 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.685343 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.685598 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:23.685542 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:23.685661 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:23.685649 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret podName:2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.185603557 +0000 UTC m=+5.547146309 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret") pod "global-pull-secret-syncer-snqxc" (UID: "2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:23.685737 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.685724 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-kubelet-config\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:23.685885 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:23.685865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-dbus\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:24.189053 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:24.188488 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:24.189053 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.188644 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:24.189053 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.188704 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret podName:2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.188685864 +0000 UTC m=+6.550228613 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret") pod "global-pull-secret-syncer-snqxc" (UID: "2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:24.694439 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:24.694401 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:24.694633 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.694600 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:24.694703 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.694662 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.694643258 +0000 UTC m=+10.056186019 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:24.795893 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:24.795830 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:24.796086 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.796028 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:24.796086 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.796048 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:24.796086 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.796061 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hj8bl for pod openshift-network-diagnostics/network-check-target-vl5lw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:24.796240 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:24.796120 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl podName:5b2a2f05-9a25-4652-ba72-816977b324b5 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:28.796100717 +0000 UTC m=+10.157643470 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "kube-api-access-hj8bl" (UniqueName: "kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl") pod "network-check-target-vl5lw" (UID: "5b2a2f05-9a25-4652-ba72-816977b324b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:25.183527 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:25.183493 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:25.183724 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:25.183644 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5"
Apr 16 13:59:25.184042 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:25.184022 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:25.184136 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:25.184118 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9"
Apr 16 13:59:25.184190 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:25.184181 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:25.184276 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:25.184260 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230"
Apr 16 13:59:25.199914 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:25.199860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:25.200346 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:25.200003 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:25.200346 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:25.200060 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret podName:2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:27.200043903 +0000 UTC m=+8.561586669 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret") pod "global-pull-secret-syncer-snqxc" (UID: "2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:27.183541 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:27.183508 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:27.184024 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:27.183658 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230"
Apr 16 13:59:27.184024 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:27.183718 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:27.184024 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:27.183803 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9"
Apr 16 13:59:27.184024 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:27.183841 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:27.184024 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:27.183899 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5"
Apr 16 13:59:27.222444 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:27.221935 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:27.222444 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:27.222087 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:27.222444 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:27.222143 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret podName:2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:31.222123931 +0000 UTC m=+12.583666680 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret") pod "global-pull-secret-syncer-snqxc" (UID: "2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:28.733993 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:28.733956 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:28.734436 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:28.734123 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:28.734436 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:28.734195 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:36.734176048 +0000 UTC m=+18.095718794 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:28.834807 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:28.834709 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:28.834995 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:28.834873 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:59:28.834995 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:28.834892 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:59:28.834995 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:28.834903 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hj8bl for pod openshift-network-diagnostics/network-check-target-vl5lw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:28.834995 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:28.834959 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl podName:5b2a2f05-9a25-4652-ba72-816977b324b5 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:36.834941918 +0000 UTC m=+18.196484678 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "kube-api-access-hj8bl" (UniqueName: "kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl") pod "network-check-target-vl5lw" (UID: "5b2a2f05-9a25-4652-ba72-816977b324b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:59:29.185557 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:29.184016 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:29.185557 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:29.184126 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5"
Apr 16 13:59:29.185557 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:29.184506 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:29.185557 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:29.184625 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9"
Apr 16 13:59:29.185557 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:29.185423 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:29.185557 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:29.185523 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230"
Apr 16 13:59:31.183085 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:31.182980 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:31.183085 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:31.183010 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:31.183085 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:31.183042 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:31.183633 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:31.183131 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5"
Apr 16 13:59:31.183633 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:31.183214 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230"
Apr 16 13:59:31.183633 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:31.183244 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9"
Apr 16 13:59:31.252763 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:31.252723 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:31.252921 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:31.252874 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:31.252996 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:31.252951 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret podName:2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.252931713 +0000 UTC m=+20.614474464 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret") pod "global-pull-secret-syncer-snqxc" (UID: "2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:59:33.182936 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:33.182899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:33.183380 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:33.182899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:33.183380 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:33.183013 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5"
Apr 16 13:59:33.183380 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:33.182899 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:33.183380 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:33.183117 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230"
Apr 16 13:59:33.183380 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:33.183192 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9"
Apr 16 13:59:35.183774 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:35.183730 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc"
Apr 16 13:59:35.183774 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:35.183771 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw"
Apr 16 13:59:35.184250 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:35.183845 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:35.184250 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:35.183854 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9"
Apr 16 13:59:35.184250 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:35.183931 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5"
Apr 16 13:59:35.184250 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:35.184028 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230"
Apr 16 13:59:36.798554 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:36.798512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 13:59:36.798987 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:36.798679 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:59:36.798987 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:36.798755 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:52.798736331 +0000 UTC m=+34.160279081 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:36.899518 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:36.899483 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:36.899725 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:36.899675 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:36.899725 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:36.899702 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:36.899725 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:36.899717 2574 projected.go:194] Error preparing data for projected volume kube-api-access-hj8bl for pod openshift-network-diagnostics/network-check-target-vl5lw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:36.899881 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:36.899776 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl podName:5b2a2f05-9a25-4652-ba72-816977b324b5 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:52.899762407 +0000 UTC m=+34.261305170 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-hj8bl" (UniqueName: "kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl") pod "network-check-target-vl5lw" (UID: "5b2a2f05-9a25-4652-ba72-816977b324b5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:37.183543 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:37.183505 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:37.183543 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:37.183524 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:37.183783 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:37.183505 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:37.183783 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:37.183628 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:37.183783 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:37.183696 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:37.183783 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:37.183764 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:39.184344 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.184176 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:39.184865 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.184252 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:39.184865 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:39.184434 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:39.184865 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.184270 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:39.184865 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:39.184511 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:39.184865 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:39.184572 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:39.251661 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.251630 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" event={"ID":"c0077ff0-36ff-4fe1-bc19-c63239f74a39","Type":"ContainerStarted","Data":"6384ecdbae137489cd8725136ade2565f0bd72d0163e78229bdac77ad32f197e"} Apr 16 13:59:39.252795 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.252770 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-44586" event={"ID":"edb2f95f-5ada-45b4-862b-187eab79d4a5","Type":"ContainerStarted","Data":"c3c5d11f04e4fcf4ff8efbc8149ceab593119cb672fc9f0754141e63699a0e63"} Apr 16 13:59:39.254149 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.254117 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"ba60185dc2d8326c6e3c28f650a4f5ad563b89237ac6d86296bed5a68e9ff01b"} Apr 16 13:59:39.255784 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.255762 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-l76kn" event={"ID":"0f53d27c-7378-4e8b-8dfd-f39beb70f859","Type":"ContainerStarted","Data":"23424be491963a46890249eb1a2a005bdc61d28365b7607725cbf1be46df2242"} Apr 16 13:59:39.256905 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.256884 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerStarted","Data":"9be67a1d6cf8bb3dad74cd7b8d8f792e0302d5a7ee7fc9315322f773c1d7963b"} Apr 16 13:59:39.257863 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.257844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xzrpt" event={"ID":"567139c7-8d34-429b-bd38-0ab1aafa14e9","Type":"ContainerStarted","Data":"a3cba6a2b884cabb10538ec46df7dcf431365c533dd98eec749122e4b4582583"} Apr 16 13:59:39.258781 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.258752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" event={"ID":"c5729c9ff098ff2004acf3ccde20c30f","Type":"ContainerStarted","Data":"a199e8ab44776de78e3f6deb40ab0410329371b6340cc35e5c6af29aca7c37a6"} Apr 16 13:59:39.259693 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.259665 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" event={"ID":"0a6ebd77-c55d-495e-acdf-6afc122a2621","Type":"ContainerStarted","Data":"c391752a05871d3a3c882025cbb9e15f3bf75c9f6cda4608af09ffbaf1d30153"} Apr 16 13:59:39.261013 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.260994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" event={"ID":"2bd4e5594979c64677900b97136c3521","Type":"ContainerStarted","Data":"20247ad270b3dbec89e4d2edffbe8496f75cdbeddc7e1b9dc2c4cfc1de05f5e8"} Apr 16 13:59:39.262116 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.262100 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-65gv2" event={"ID":"aa030d80-2a63-4669-acb7-9485b1b8ce4a","Type":"ContainerStarted","Data":"ce6a15fcf0ef8cd9497792cd9ca77f75f6a7061500117e9ae6f8c8a4bb749592"} Apr 16 
13:59:39.270765 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.270730 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-sfmlv" podStartSLOduration=3.113564439 podStartE2EDuration="20.270720353s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.569872764 +0000 UTC m=+2.931415511" lastFinishedPulling="2026-04-16 13:59:38.727028664 +0000 UTC m=+20.088571425" observedRunningTime="2026-04-16 13:59:39.270456805 +0000 UTC m=+20.631999574" watchObservedRunningTime="2026-04-16 13:59:39.270720353 +0000 UTC m=+20.632263120" Apr 16 13:59:39.283815 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.283767 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-139-151.ec2.internal" podStartSLOduration=19.283750065 podStartE2EDuration="19.283750065s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:39.283642455 +0000 UTC m=+20.645185227" watchObservedRunningTime="2026-04-16 13:59:39.283750065 +0000 UTC m=+20.645292836" Apr 16 13:59:39.297374 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.297333 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xzrpt" podStartSLOduration=3.205777238 podStartE2EDuration="20.29732042s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.536934378 +0000 UTC m=+2.898477139" lastFinishedPulling="2026-04-16 13:59:38.628477575 +0000 UTC m=+19.990020321" observedRunningTime="2026-04-16 13:59:39.297159122 +0000 UTC m=+20.658701890" watchObservedRunningTime="2026-04-16 13:59:39.29732042 +0000 UTC m=+20.658863209" Apr 16 13:59:39.311347 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.311306 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-l76kn" podStartSLOduration=3.153536561 podStartE2EDuration="20.311293041s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.542358855 +0000 UTC m=+2.903901602" lastFinishedPulling="2026-04-16 13:59:38.700115332 +0000 UTC m=+20.061658082" observedRunningTime="2026-04-16 13:59:39.311072581 +0000 UTC m=+20.672615349" watchObservedRunningTime="2026-04-16 13:59:39.311293041 +0000 UTC m=+20.672835805" Apr 16 13:59:39.317394 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.317347 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:39.317536 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:39.317416 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:39.317536 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:39.317463 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret podName:2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:55.31744764 +0000 UTC m=+36.678990397 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret") pod "global-pull-secret-syncer-snqxc" (UID: "2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:39.352364 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.352313 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-65gv2" podStartSLOduration=3.195169202 podStartE2EDuration="20.35229819s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.57000325 +0000 UTC m=+2.931546004" lastFinishedPulling="2026-04-16 13:59:38.727132239 +0000 UTC m=+20.088674992" observedRunningTime="2026-04-16 13:59:39.352077448 +0000 UTC m=+20.713620220" watchObservedRunningTime="2026-04-16 13:59:39.35229819 +0000 UTC m=+20.713840959" Apr 16 13:59:39.375852 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.375807 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-44586" podStartSLOduration=3.191700364 podStartE2EDuration="20.375792739s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.545263487 +0000 UTC m=+2.906806234" lastFinishedPulling="2026-04-16 13:59:38.72935585 +0000 UTC m=+20.090898609" observedRunningTime="2026-04-16 13:59:39.375508943 +0000 UTC m=+20.737051711" watchObservedRunningTime="2026-04-16 13:59:39.375792739 +0000 UTC m=+20.737335506" Apr 16 13:59:39.395792 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.395748 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-139-151.ec2.internal" podStartSLOduration=19.395731747 podStartE2EDuration="19.395731747s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:39.395403647 +0000 UTC m=+20.756946415" watchObservedRunningTime="2026-04-16 13:59:39.395731747 +0000 UTC m=+20.757274514" Apr 16 13:59:39.769593 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.769335 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:39.769963 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:39.769945 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:40.266339 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.266317 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 13:59:40.267145 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.266595 2574 generic.go:358] "Generic (PLEG): container finished" podID="467df738-bd26-4dba-b771-01c7f6844b70" containerID="03594ac649fe9edd51930210783b3b6de4c811a65db7cb0a406873b18c809894" exitCode=1 Apr 16 13:59:40.267145 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.266646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" 
event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"a28099ec99a48d6be58302f0b44e47f6c46744a91acf4b52c5ac7cdf57e8ac50"} Apr 16 13:59:40.267145 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.266681 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"f58f8334694a4608389c6737c54549633ce1492abc442f4f750ce453306b3715"} Apr 16 13:59:40.267145 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.266691 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"5b1d83b9b408c70c29db9b3f464e45d47861ac4fc54f216130605b6c2cb20ea1"} Apr 16 13:59:40.267145 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.266699 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"72490730f312e1187f09923ae6f1cfaa5321bcd61cfcf52e93428e4efa372ea4"} Apr 16 13:59:40.267145 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.266707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerDied","Data":"03594ac649fe9edd51930210783b3b6de4c811a65db7cb0a406873b18c809894"} Apr 16 13:59:40.267873 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.267855 2574 generic.go:358] "Generic (PLEG): container finished" podID="5596cb4f-7692-4c74-82c7-87e46bdfd720" containerID="9be67a1d6cf8bb3dad74cd7b8d8f792e0302d5a7ee7fc9315322f773c1d7963b" exitCode=0 Apr 16 13:59:40.268035 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.268006 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerDied","Data":"9be67a1d6cf8bb3dad74cd7b8d8f792e0302d5a7ee7fc9315322f773c1d7963b"} Apr 16 13:59:40.268461 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.268445 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:40.268683 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.268657 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-l76kn" Apr 16 13:59:40.726865 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:40.726842 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:41.129952 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.129852 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:40.726862846Z","UUID":"579b2868-1bd5-4807-85fb-e50cdf885fbe","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:41.133059 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.133034 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:41.133191 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.133067 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: 
ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:41.183379 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.183341 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:41.183379 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.183368 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:41.183379 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.183386 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:41.183673 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:41.183489 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:41.183673 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:41.183635 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:41.183816 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:41.183785 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:41.271960 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.271928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zzzpm" event={"ID":"2febdb01-c922-4ac4-81e6-2b92df909f85","Type":"ContainerStarted","Data":"ae002300a510e71e36d3b574021fb644cc6ee4b2c0632d506bf455bd134b4958"} Apr 16 13:59:41.273784 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.273759 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" event={"ID":"0a6ebd77-c55d-495e-acdf-6afc122a2621","Type":"ContainerStarted","Data":"bbc62c8443e5aafbb689ea6ace84ecd2beab6c81fcc9a51403ecf5ee6a018df7"} Apr 16 13:59:41.289054 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:41.289009 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zzzpm" podStartSLOduration=5.120957238 podStartE2EDuration="22.288994693s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.569984695 +0000 UTC m=+2.931527460" lastFinishedPulling="2026-04-16 13:59:38.738022156 +0000 UTC m=+20.099564915" observedRunningTime="2026-04-16 13:59:41.288605052 +0000 UTC m=+22.650147821" watchObservedRunningTime="2026-04-16 13:59:41.288994693 +0000 UTC m=+22.650537462" Apr 16 13:59:42.279164 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:42.278979 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 13:59:42.279686 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:42.279649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"5a58f48d69b36d73816938f81101cfce404f1a194d182b9631a002ecfb421d4e"} Apr 16 13:59:43.183058 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:43.183023 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:43.183237 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:43.183145 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:43.183237 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:43.183151 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:43.183353 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:43.183253 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:43.183353 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:43.183248 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:43.183353 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:43.183341 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:43.283860 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:43.283764 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" event={"ID":"0a6ebd77-c55d-495e-acdf-6afc122a2621","Type":"ContainerStarted","Data":"b2b9380e282f120e2d2fd0d17ff29b62faa00456d5d80b14886933ca252599b6"} Apr 16 13:59:43.301770 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:43.301722 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-r9zqs" podStartSLOduration=3.7202049969999997 podStartE2EDuration="24.301706922s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.538427894 +0000 UTC m=+2.899970640" lastFinishedPulling="2026-04-16 13:59:42.119929804 +0000 UTC m=+23.481472565" observedRunningTime="2026-04-16 13:59:43.30156072 +0000 UTC m=+24.663103502" watchObservedRunningTime="2026-04-16 13:59:43.301706922 +0000 UTC m=+24.663249690" Apr 16 13:59:45.186640 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.183760 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:45.186640 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:45.184263 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:45.186640 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.183936 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:45.186640 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.183889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:45.186640 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:45.184541 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:45.186640 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:45.185636 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:45.289815 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.289788 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 13:59:45.290112 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.290092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"67ec383ee017e30c1a9a12d1ed0907a58c720bf3f734ad632a80d0c76d6e7395"} Apr 16 13:59:45.290413 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.290379 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:45.290551 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.290536 2574 scope.go:117] "RemoveContainer" containerID="03594ac649fe9edd51930210783b3b6de4c811a65db7cb0a406873b18c809894" Apr 16 13:59:45.291628 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.291607 2574 generic.go:358] "Generic (PLEG): container finished" podID="5596cb4f-7692-4c74-82c7-87e46bdfd720" containerID="eccb676c3a49bae668eade8ab2053bdf83d3b55ecf1410411ffa60491d593810" exitCode=0 Apr 16 13:59:45.291673 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.291647 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerDied","Data":"eccb676c3a49bae668eade8ab2053bdf83d3b55ecf1410411ffa60491d593810"} Apr 16 13:59:45.306006 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:45.305983 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:46.222835 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.222798 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:46.296326 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.296305 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 13:59:46.296664 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.296644 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" event={"ID":"467df738-bd26-4dba-b771-01c7f6844b70","Type":"ContainerStarted","Data":"cd95355c4335a9e9e1f09aea9f12a31435b24db9aa499c34905af8a5b4f4d634"} Apr 16 13:59:46.296865 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.296841 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:46.311140 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.311118 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 13:59:46.323757 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.323071 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" podStartSLOduration=9.802320057 podStartE2EDuration="27.323053453s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 
13:59:21.543804976 +0000 UTC m=+2.905347724" lastFinishedPulling="2026-04-16 13:59:39.06453836 +0000 UTC m=+20.426081120" observedRunningTime="2026-04-16 13:59:46.321333952 +0000 UTC m=+27.682876733" watchObservedRunningTime="2026-04-16 13:59:46.323053453 +0000 UTC m=+27.684596226" Apr 16 13:59:46.486760 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.486676 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fbnhb"] Apr 16 13:59:46.486896 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.486820 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:46.486940 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:46.486919 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:46.487408 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.487387 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vl5lw"] Apr 16 13:59:46.487494 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.487488 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:46.487590 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:46.487562 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:46.500644 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.500614 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-snqxc"] Apr 16 13:59:46.500782 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:46.500742 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:46.500843 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:46.500832 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:47.300689 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:47.300413 2574 generic.go:358] "Generic (PLEG): container finished" podID="5596cb4f-7692-4c74-82c7-87e46bdfd720" containerID="6eca541bcaf00bc189204a9b1244101833ec8a7fb274b700ac79017d9c360009" exitCode=0 Apr 16 13:59:47.301072 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:47.300492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerDied","Data":"6eca541bcaf00bc189204a9b1244101833ec8a7fb274b700ac79017d9c360009"} Apr 16 13:59:48.183415 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:48.183385 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:48.183575 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:48.183385 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:48.183575 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:48.183510 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:48.183575 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:48.183557 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:48.183575 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:48.183387 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:48.183853 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:48.183651 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:48.304675 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:48.304574 2574 generic.go:358] "Generic (PLEG): container finished" podID="5596cb4f-7692-4c74-82c7-87e46bdfd720" containerID="286fd50ba391a85ede328da16624d4ee802a81478fe04fe1c749541cf38043ed" exitCode=0 Apr 16 13:59:48.305089 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:48.304685 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerDied","Data":"286fd50ba391a85ede328da16624d4ee802a81478fe04fe1c749541cf38043ed"} Apr 16 13:59:50.182998 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:50.182964 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:50.183653 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:50.182964 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:50.183653 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:50.183078 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-snqxc" podUID="2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9" Apr 16 13:59:50.183653 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:50.182976 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:50.183653 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:50.183164 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fbnhb" podUID="3d0ab572-848b-495c-afdf-ad744ea2b230" Apr 16 13:59:50.183653 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:50.183245 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-vl5lw" podUID="5b2a2f05-9a25-4652-ba72-816977b324b5" Apr 16 13:59:52.002932 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.002855 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-139-151.ec2.internal" event="NodeReady" Apr 16 13:59:52.003637 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.003018 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:59:52.049067 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.049034 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-84w8w"] Apr 16 13:59:52.051924 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.051901 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.054455 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.054390 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2bcnf"] Apr 16 13:59:52.056561 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.056541 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tpzlf\"" Apr 16 13:59:52.056691 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.056606 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 13:59:52.057167 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.057147 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:52.060385 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.060365 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 13:59:52.060695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.060670 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 13:59:52.060847 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.060829 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 13:59:52.061088 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.060899 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbpqw\"" Apr 16 13:59:52.061652 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.061634 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 13:59:52.066740 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.066572 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-84w8w"] Apr 16 13:59:52.068834 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.068810 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2bcnf"] Apr 16 13:59:52.183190 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.183155 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:52.183190 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.183192 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:52.183428 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.183157 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:52.185497 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.185475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:59:52.185497 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.185496 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:59:52.185736 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.185505 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-gs8rq\"" Apr 16 13:59:52.185736 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.185630 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:59:52.185736 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.185653 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x5z8x\"" Apr 16 13:59:52.185736 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.185726 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 13:59:52.219165 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.219136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-config-volume\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.219310 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.219200 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.219310 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.219247 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brqm8\" (UniqueName: \"kubernetes.io/projected/e05dc6e0-3aae-437d-a7bd-6b5851441185-kube-api-access-brqm8\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:52.219310 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.219275 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpmz\" (UniqueName: \"kubernetes.io/projected/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-kube-api-access-pfpmz\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.219310 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.219303 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:52.219477 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.219362 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-tmp-dir\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.320364 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.320285 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-tmp-dir\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.320364 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.320334 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-config-volume\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.320617 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.320366 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.320617 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.320422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-brqm8\" (UniqueName: \"kubernetes.io/projected/e05dc6e0-3aae-437d-a7bd-6b5851441185-kube-api-access-brqm8\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:52.320617 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.320447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpmz\" (UniqueName: \"kubernetes.io/projected/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-kube-api-access-pfpmz\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.320617 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.320524 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:52.320617 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.320604 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:52.820565865 +0000 UTC m=+34.182108624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 13:59:52.320854 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.320625 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:52.320854 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.320703 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:52.320854 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.320727 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-tmp-dir\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.320854 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.320743 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:52.820730696 +0000 UTC m=+34.182273442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 13:59:52.321048 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.321018 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-config-volume\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.333483 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.333454 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpmz\" (UniqueName: \"kubernetes.io/projected/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-kube-api-access-pfpmz\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.333649 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.333532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-brqm8\" (UniqueName: \"kubernetes.io/projected/e05dc6e0-3aae-437d-a7bd-6b5851441185-kube-api-access-brqm8\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:52.825137 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.825068 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:52.825137 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.825143 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:52.825386 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.825168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 13:59:52.825386 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.825232 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:52.825386 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.825291 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:52.825386 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.825308 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.825285506 +0000 UTC m=+35.186828275 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 13:59:52.825386 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.825343 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.825331913 +0000 UTC m=+35.186874663 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 13:59:52.825386 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.825382 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 13:59:52.825649 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:52.825448 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:24.825432131 +0000 UTC m=+66.186974881 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : secret "metrics-daemon-secret" not found Apr 16 13:59:52.925653 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.925611 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:52.928415 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:52.928388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj8bl\" (UniqueName: \"kubernetes.io/projected/5b2a2f05-9a25-4652-ba72-816977b324b5-kube-api-access-hj8bl\") pod \"network-check-target-vl5lw\" (UID: \"5b2a2f05-9a25-4652-ba72-816977b324b5\") " pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:53.095085 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:53.094999 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:53.834251 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:53.834214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:53.834437 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:53.834313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:53.834437 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:53.834367 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:53.834437 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:53.834401 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:53.834604 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:53.834451 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.834430855 +0000 UTC m=+37.195973603 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 13:59:53.834604 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:53.834470 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:55.834461087 +0000 UTC m=+37.196003836 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 13:59:54.589665 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:54.589457 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-vl5lw"] Apr 16 13:59:54.694763 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:54.694689 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2a2f05_9a25_4652_ba72_816977b324b5.slice/crio-123fcf68d80496fcfd5a234ae1d4be8f2ec3a124ac9af6089687196b2f5ff900 WatchSource:0}: Error finding container 123fcf68d80496fcfd5a234ae1d4be8f2ec3a124ac9af6089687196b2f5ff900: Status 404 returned error can't find the container with id 123fcf68d80496fcfd5a234ae1d4be8f2ec3a124ac9af6089687196b2f5ff900 Apr 16 13:59:55.321319 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.321290 2574 generic.go:358] "Generic (PLEG): container finished" podID="5596cb4f-7692-4c74-82c7-87e46bdfd720" containerID="9989d9ce19ef2bb4cce6c9caa3c0cf81ebce6b64bcafcdd018e8f0d06f10a0f8" exitCode=0 Apr 16 13:59:55.321496 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.321330 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerDied","Data":"9989d9ce19ef2bb4cce6c9caa3c0cf81ebce6b64bcafcdd018e8f0d06f10a0f8"} Apr 16 13:59:55.322491 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.322443 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vl5lw" event={"ID":"5b2a2f05-9a25-4652-ba72-816977b324b5","Type":"ContainerStarted","Data":"123fcf68d80496fcfd5a234ae1d4be8f2ec3a124ac9af6089687196b2f5ff900"} Apr 16 13:59:55.345926 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.345890 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:55.350529 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.350503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9-original-pull-secret\") pod \"global-pull-secret-syncer-snqxc\" (UID: \"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9\") " pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:55.507597 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.507558 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-snqxc" Apr 16 13:59:55.651187 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.650987 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-snqxc"] Apr 16 13:59:55.654411 ip-10-0-139-151 kubenswrapper[2574]: W0416 13:59:55.654384 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c1c9f52_2fd5_4d89_83a2_c7f03121a1f9.slice/crio-7031ce88947c4f5d0e9d6c69bee5a30e4f740443dc9c21f4c55acfb1d64aca41 WatchSource:0}: Error finding container 7031ce88947c4f5d0e9d6c69bee5a30e4f740443dc9c21f4c55acfb1d64aca41: Status 404 returned error can't find the container with id 7031ce88947c4f5d0e9d6c69bee5a30e4f740443dc9c21f4c55acfb1d64aca41 Apr 16 13:59:55.850854 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.850759 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:55.851022 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:55.850855 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:55.851022 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:55.850956 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:55.851022 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:55.850975 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:55.851169 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:55.851043 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.851021806 +0000 UTC m=+41.212564574 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 13:59:55.851169 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:55.851065 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:59.851055165 +0000 UTC m=+41.212597937 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 13:59:56.327556 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:56.327519 2574 generic.go:358] "Generic (PLEG): container finished" podID="5596cb4f-7692-4c74-82c7-87e46bdfd720" containerID="c0baf2991cbb9ba24fb83f830c87647bc148559173e99c16ab2be8475f4c6a9a" exitCode=0 Apr 16 13:59:56.327746 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:56.327611 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerDied","Data":"c0baf2991cbb9ba24fb83f830c87647bc148559173e99c16ab2be8475f4c6a9a"} Apr 16 13:59:56.328800 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:56.328770 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-snqxc" event={"ID":"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9","Type":"ContainerStarted","Data":"7031ce88947c4f5d0e9d6c69bee5a30e4f740443dc9c21f4c55acfb1d64aca41"} Apr 16 13:59:57.334408 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:57.334368 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" event={"ID":"5596cb4f-7692-4c74-82c7-87e46bdfd720","Type":"ContainerStarted","Data":"bb99599e7c5810ffe57b88dc64aa788adc206dd77ac30d2edb3b20e7dc522f60"} Apr 16 13:59:57.360192 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:57.360139 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-c2sdt" podStartSLOduration=5.1582257049999996 podStartE2EDuration="38.360119218s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:21.540058825 +0000 UTC m=+2.901601572" lastFinishedPulling="2026-04-16 13:59:54.741952336 +0000 UTC m=+36.103495085" observedRunningTime="2026-04-16 13:59:57.359898056 +0000 UTC m=+38.721440836" watchObservedRunningTime="2026-04-16 13:59:57.360119218 +0000 UTC m=+38.721661999" Apr 16 13:59:58.337454 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:58.337418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-vl5lw" event={"ID":"5b2a2f05-9a25-4652-ba72-816977b324b5","Type":"ContainerStarted","Data":"a4beffd94e8f6245c19e12da7bc9238900c22588ce1fef54dde78e82560ea165"} Apr 16 13:59:58.337908 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:58.337784 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 13:59:58.355695 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:58.355611 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-vl5lw" podStartSLOduration=36.153421404 podStartE2EDuration="39.355591981s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 13:59:54.720199622 +0000 UTC m=+36.081742368" lastFinishedPulling="2026-04-16 13:59:57.922370185 +0000 UTC m=+39.283912945" observedRunningTime="2026-04-16 13:59:58.355491428 +0000 UTC m=+39.717034197" watchObservedRunningTime="2026-04-16 13:59:58.355591981 +0000 UTC m=+39.717134741" Apr 16 13:59:59.877745 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:59.877699 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 13:59:59.878207 ip-10-0-139-151 kubenswrapper[2574]: I0416 13:59:59.877780 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 13:59:59.878207 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:59.877854 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:59.878207 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:59.877922 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:59.878207 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:59.877948 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:07.877927066 +0000 UTC m=+49.239469826 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 13:59:59.878207 ip-10-0-139-151 kubenswrapper[2574]: E0416 13:59:59.877970 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:07.877956486 +0000 UTC m=+49.239499232 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 14:00:02.345999 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:02.345963 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-snqxc" event={"ID":"2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9","Type":"ContainerStarted","Data":"dcd0fb2b314a0709ab2536fcb95129c3ee34d9f61d3e257c04f99a7d32e44bb0"} Apr 16 14:00:02.361432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:02.361386 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-snqxc" podStartSLOduration=33.733895755 podStartE2EDuration="39.361372504s" podCreationTimestamp="2026-04-16 13:59:23 +0000 UTC" firstStartedPulling="2026-04-16 13:59:55.658236312 +0000 UTC m=+37.019779059" lastFinishedPulling="2026-04-16 14:00:01.285713048 +0000 UTC m=+42.647255808" observedRunningTime="2026-04-16 14:00:02.360743594 +0000 UTC m=+43.722286396" watchObservedRunningTime="2026-04-16 14:00:02.361372504 +0000 UTC m=+43.722915272" Apr 16 14:00:07.925813 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:07.925770 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 14:00:07.925813 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:07.925825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 14:00:07.926239 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:07.925934 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:07.926239 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:07.926010 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:23.925990364 +0000 UTC m=+65.287533111 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 14:00:07.926239 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:07.926032 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:07.926239 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:07.926085 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:23.926072266 +0000 UTC m=+65.287615023 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 14:00:18.323225 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:18.323196 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-l6b5l" Apr 16 14:00:23.940956 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:23.940917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 14:00:23.941408 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:23.940965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 14:00:23.941408 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:23.941067 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:23.941408 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:23.941103 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:23.941408 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:23.941136 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:55.941119989 +0000 UTC m=+97.302662735 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 14:00:23.941408 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:23.941156 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:55.941143945 +0000 UTC m=+97.302686692 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 14:00:24.847085 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:24.847009 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb" Apr 16 14:00:24.847261 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:24.847157 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:00:24.847261 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:24.847236 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs podName:3d0ab572-848b-495c-afdf-ad744ea2b230 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:28.847217735 +0000 UTC m=+130.208760482 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs") pod "network-metrics-daemon-fbnhb" (UID: "3d0ab572-848b-495c-afdf-ad744ea2b230") : secret "metrics-daemon-secret" not found Apr 16 14:00:30.343761 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:30.343729 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-vl5lw" Apr 16 14:00:55.955470 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:55.955331 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w" Apr 16 14:00:55.955470 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:55.955390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf" Apr 16 14:00:55.955470 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:55.955478 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:55.955941 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:55.955480 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:55.955941 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:55.955536 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert podName:e05dc6e0-3aae-437d-a7bd-6b5851441185 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:59.955519796 +0000 UTC m=+161.317062555 (durationBeforeRetry 1m4s). 
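Note: the durationBeforeRetry values for the metrics-tls and cert volumes trace a clean exponential backoff since the first failure at 13:59:52 — 500ms, 1s, 2s, 4s, 8s, 16s, 32s, and now 1m4s — while the multus metrics-certs volume, whose failures began before this section, was already at 32s and then 1m4s. Each volume carries its own backoff state. A sketch of that cadence; the 2m2s cap is my assumption about the kubelet's default, not something visible in this log:

```go
// Sketch of the retry cadence visible above: each failed MountVolume.SetUp for a
// given volume doubles the wait, starting at 500ms. The cap shown (2m2s) matches
// what I believe the kubelet uses, but treat it as an assumption.
package main

import (
	"fmt"
	"time"
)

func main() {
	d := 500 * time.Millisecond
	const maxDelay = 2*time.Minute + 2*time.Second
	for i := 0; i < 10; i++ {
		fmt.Println(d) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
		d *= 2
		if d > maxDelay {
			d = maxDelay
		}
	}
}
```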
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert") pod "ingress-canary-2bcnf" (UID: "e05dc6e0-3aae-437d-a7bd-6b5851441185") : secret "canary-serving-cert" not found Apr 16 14:00:55.955941 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:55.955548 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls podName:defde0b7-6e87-43ef-ad17-0ce3ffc5d902 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:59.955542715 +0000 UTC m=+161.317085461 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls") pod "dns-default-84w8w" (UID: "defde0b7-6e87-43ef-ad17-0ce3ffc5d902") : secret "dns-default-metrics-tls" not found Apr 16 14:00:57.531613 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.531563 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w"] Apr 16 14:00:57.536094 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.536068 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" Apr 16 14:00:57.538308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.538278 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:57.538308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.538295 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:57.538469 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.538383 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-f5vdd\"" Apr 16 14:00:57.540646 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.540626 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w"] Apr 16 14:00:57.642296 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.642259 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xpgj4"] Apr 16 14:00:57.645088 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.645072 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.648291 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.648271 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:00:57.648429 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.648300 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-rtckz\"" Apr 16 14:00:57.648429 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.648298 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:00:57.648793 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.648779 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:00:57.648864 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.648798 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:00:57.654991 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.654972 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:00:57.663297 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.663277 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xpgj4"] Apr 16 14:00:57.667957 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.667934 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ntpr\" (UniqueName: \"kubernetes.io/projected/2310273c-d482-4d05-b71f-6db31c8a2fe1-kube-api-access-2ntpr\") pod \"volume-data-source-validator-7d955d5dd4-vjw2w\" (UID: \"2310273c-d482-4d05-b71f-6db31c8a2fe1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" Apr 16 14:00:57.763003 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.762971 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-87948cf45-bnd9h"] Apr 16 14:00:57.765825 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.765807 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.768205 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.768183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqvb2\" (UniqueName: \"kubernetes.io/projected/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-kube-api-access-xqvb2\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.768343 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.768220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-snapshots\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.768343 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.768293 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ntpr\" (UniqueName: \"kubernetes.io/projected/2310273c-d482-4d05-b71f-6db31c8a2fe1-kube-api-access-2ntpr\") pod \"volume-data-source-validator-7d955d5dd4-vjw2w\" (UID: \"2310273c-d482-4d05-b71f-6db31c8a2fe1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" Apr 16 14:00:57.768442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.768344 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-tmp\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.768442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.768362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.768442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.768378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.768442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.768400 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-serving-cert\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.772479 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.772456 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 14:00:57.772479 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.772472 2574 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 14:00:57.772656 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.772473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 14:00:57.772656 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.772503 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-7shls\"" Apr 16 14:00:57.781544 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.781513 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 14:00:57.783292 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.783241 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-87948cf45-bnd9h"] Apr 16 14:00:57.793148 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.793121 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ntpr\" (UniqueName: \"kubernetes.io/projected/2310273c-d482-4d05-b71f-6db31c8a2fe1-kube-api-access-2ntpr\") pod \"volume-data-source-validator-7d955d5dd4-vjw2w\" (UID: \"2310273c-d482-4d05-b71f-6db31c8a2fe1\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" Apr 16 14:00:57.844957 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.844906 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" Apr 16 14:00:57.868870 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.868840 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-serving-cert\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.869022 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.868881 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869022 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b61adc64-b31c-42dc-b211-b78b9427eac1-ca-trust-extracted\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869118 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869032 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-trusted-ca\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869118 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869051 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-installation-pull-secrets\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869118 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869077 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bjwn\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-kube-api-access-8bjwn\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869224 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869177 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-certificates\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869224 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869193 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-bound-sa-token\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869296 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqvb2\" (UniqueName: \"kubernetes.io/projected/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-kube-api-access-xqvb2\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.869296 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-snapshots\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.869400 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-tmp\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.869457 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.869457 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869444 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.869547 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869480 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-image-registry-private-configuration\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.869849 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-tmp\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.869953 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.869895 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-snapshots\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.870110 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.870092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.870367 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.870348 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.872224 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.872202 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-serving-cert\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.881005 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.880980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqvb2\" (UniqueName: \"kubernetes.io/projected/7c051fe0-3220-4517-8c4c-4c0a8bf7518d-kube-api-access-xqvb2\") pod \"insights-operator-5785d4fcdd-xpgj4\" (UID: \"7c051fe0-3220-4517-8c4c-4c0a8bf7518d\") " pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.953712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.953680 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" Apr 16 14:00:57.964032 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.964003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w"] Apr 16 14:00:57.969797 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.969773 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-certificates\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.969909 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.969802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-bound-sa-token\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.969909 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.969860 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-image-registry-private-configuration\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.970020 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.969946 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.970020 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.969984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b61adc64-b31c-42dc-b211-b78b9427eac1-ca-trust-extracted\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.970020 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.970010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-trusted-ca\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.970220 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.970036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-installation-pull-secrets\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.970220 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:57.970061 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret 
"image-registry-tls" not found Apr 16 14:00:57.970220 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:57.970079 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-87948cf45-bnd9h: secret "image-registry-tls" not found Apr 16 14:00:57.970220 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:57.970142 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls podName:b61adc64-b31c-42dc-b211-b78b9427eac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:58.470121073 +0000 UTC m=+99.831663820 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls") pod "image-registry-87948cf45-bnd9h" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1") : secret "image-registry-tls" not found Apr 16 14:00:57.970220 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.970066 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bjwn\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-kube-api-access-8bjwn\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.970631 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.970503 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-certificates\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.970736 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.970633 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b61adc64-b31c-42dc-b211-b78b9427eac1-ca-trust-extracted\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.971142 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.971117 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-trusted-ca\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.972284 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.972264 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-image-registry-private-configuration\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.972794 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.972775 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-installation-pull-secrets\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " 
pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.979348 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.979324 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-bound-sa-token\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:57.980159 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:57.980136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bjwn\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-kube-api-access-8bjwn\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:58.067653 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:58.067561 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xpgj4"] Apr 16 14:00:58.070492 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:00:58.070457 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c051fe0_3220_4517_8c4c_4c0a8bf7518d.slice/crio-e1829549ebf2b297a7566a27956356544fc1fb88db8c183a84b94a94d6cec22b WatchSource:0}: Error finding container e1829549ebf2b297a7566a27956356544fc1fb88db8c183a84b94a94d6cec22b: Status 404 returned error can't find the container with id e1829549ebf2b297a7566a27956356544fc1fb88db8c183a84b94a94d6cec22b Apr 16 14:00:58.452527 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:58.452490 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" event={"ID":"7c051fe0-3220-4517-8c4c-4c0a8bf7518d","Type":"ContainerStarted","Data":"e1829549ebf2b297a7566a27956356544fc1fb88db8c183a84b94a94d6cec22b"} Apr 16 14:00:58.453412 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:58.453387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" event={"ID":"2310273c-d482-4d05-b71f-6db31c8a2fe1","Type":"ContainerStarted","Data":"e6648446ef72610ae76803a2d1babe445d811531bf174e5e50f9c6847ace6fda"} Apr 16 14:00:58.473750 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:58.473727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:58.473913 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:58.473894 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:58.473979 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:58.473916 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-87948cf45-bnd9h: secret "image-registry-tls" not found Apr 16 14:00:58.474029 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:58.473982 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls podName:b61adc64-b31c-42dc-b211-b78b9427eac1 
nodeName:}" failed. No retries permitted until 2026-04-16 14:00:59.473961831 +0000 UTC m=+100.835504603 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls") pod "image-registry-87948cf45-bnd9h" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1") : secret "image-registry-tls" not found Apr 16 14:00:59.480825 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:00:59.480791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:00:59.481288 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:59.480962 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:59.481288 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:59.480987 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-87948cf45-bnd9h: secret "image-registry-tls" not found Apr 16 14:00:59.481288 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:00:59.481068 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls podName:b61adc64-b31c-42dc-b211-b78b9427eac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:01.481036777 +0000 UTC m=+102.842579524 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls") pod "image-registry-87948cf45-bnd9h" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1") : secret "image-registry-tls" not found Apr 16 14:01:01.460947 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.460900 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" event={"ID":"7c051fe0-3220-4517-8c4c-4c0a8bf7518d","Type":"ContainerStarted","Data":"9cc2fefad0c395c1080960422e04e9a7fcede633fa5121e69b4d2dbbd35b3dd6"} Apr 16 14:01:01.462164 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.462141 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" event={"ID":"2310273c-d482-4d05-b71f-6db31c8a2fe1","Type":"ContainerStarted","Data":"0d736f707b127bdb12a2c095d89535b2dc9592b35a917003b1ff34d46251dbb6"} Apr 16 14:01:01.479697 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.479642 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" podStartSLOduration=1.6627293490000001 podStartE2EDuration="4.479628379s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="2026-04-16 14:00:58.072164296 +0000 UTC m=+99.433707042" lastFinishedPulling="2026-04-16 14:01:00.889063321 +0000 UTC m=+102.250606072" observedRunningTime="2026-04-16 14:01:01.478780611 +0000 UTC m=+102.840323382" watchObservedRunningTime="2026-04-16 14:01:01.479628379 +0000 UTC m=+102.841171141" Apr 16 14:01:01.493759 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.493704 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-vjw2w" podStartSLOduration=1.573974639 podStartE2EDuration="4.493685154s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="2026-04-16 14:00:57.969721818 +0000 UTC m=+99.331264565" lastFinishedPulling="2026-04-16 14:01:00.889432318 +0000 UTC m=+102.250975080" observedRunningTime="2026-04-16 14:01:01.493219563 +0000 UTC m=+102.854762323" watchObservedRunningTime="2026-04-16 14:01:01.493685154 +0000 UTC m=+102.855227924" Apr 16 14:01:01.496696 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.496668 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:01:01.496852 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:01.496832 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:01.496894 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:01.496857 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-87948cf45-bnd9h: secret "image-registry-tls" not found Apr 16 14:01:01.496940 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:01.496930 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls podName:b61adc64-b31c-42dc-b211-b78b9427eac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:05.496911046 +0000 UTC m=+106.858453796 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls") pod "image-registry-87948cf45-bnd9h" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1") : secret "image-registry-tls" not found Apr 16 14:01:01.674432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.674398 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq"] Apr 16 14:01:01.677444 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.677426 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" Apr 16 14:01:01.679692 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.679673 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-2kldw\"" Apr 16 14:01:01.679806 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.679694 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 14:01:01.679806 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.679724 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 14:01:01.685527 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.685504 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq"] Apr 16 14:01:01.799484 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.799393 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tx6c\" (UniqueName: \"kubernetes.io/projected/956aabe3-df2b-45a4-bf9b-66468aff27cd-kube-api-access-7tx6c\") pod \"migrator-64d4d94569-jm8nq\" (UID: \"956aabe3-df2b-45a4-bf9b-66468aff27cd\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" Apr 16 14:01:01.899757 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.899719 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tx6c\" (UniqueName: \"kubernetes.io/projected/956aabe3-df2b-45a4-bf9b-66468aff27cd-kube-api-access-7tx6c\") pod \"migrator-64d4d94569-jm8nq\" (UID: \"956aabe3-df2b-45a4-bf9b-66468aff27cd\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" Apr 16 14:01:01.908700 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.908665 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tx6c\" (UniqueName: \"kubernetes.io/projected/956aabe3-df2b-45a4-bf9b-66468aff27cd-kube-api-access-7tx6c\") pod \"migrator-64d4d94569-jm8nq\" (UID: \"956aabe3-df2b-45a4-bf9b-66468aff27cd\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" Apr 16 14:01:01.987262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:01.987221 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" Apr 16 14:01:02.101831 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:02.101798 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq"] Apr 16 14:01:02.105272 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:01:02.105237 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod956aabe3_df2b_45a4_bf9b_66468aff27cd.slice/crio-c2dee2d815feb1ef6891275d972af6be9881ecf02d11cfa8ef3fc8a9b9063d5b WatchSource:0}: Error finding container c2dee2d815feb1ef6891275d972af6be9881ecf02d11cfa8ef3fc8a9b9063d5b: Status 404 returned error can't find the container with id c2dee2d815feb1ef6891275d972af6be9881ecf02d11cfa8ef3fc8a9b9063d5b Apr 16 14:01:02.465226 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:02.465193 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" event={"ID":"956aabe3-df2b-45a4-bf9b-66468aff27cd","Type":"ContainerStarted","Data":"c2dee2d815feb1ef6891275d972af6be9881ecf02d11cfa8ef3fc8a9b9063d5b"} Apr 16 14:01:03.470303 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:03.470223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" event={"ID":"956aabe3-df2b-45a4-bf9b-66468aff27cd","Type":"ContainerStarted","Data":"5ca0e465c832f203604abdf124c806fd6e9b7dd9a65e98c6980fc5c46fde0408"} Apr 16 14:01:03.470303 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:03.470255 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" event={"ID":"956aabe3-df2b-45a4-bf9b-66468aff27cd","Type":"ContainerStarted","Data":"05931da6683c8f3854cd35e12a63e705e3485067ba794f8e8e18e7680da11f1c"} Apr 16 14:01:03.485123 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:03.485086 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-jm8nq" podStartSLOduration=1.388152103 podStartE2EDuration="2.485071655s" podCreationTimestamp="2026-04-16 14:01:01 +0000 UTC" firstStartedPulling="2026-04-16 14:01:02.107168171 +0000 UTC m=+103.468710921" lastFinishedPulling="2026-04-16 14:01:03.204087724 +0000 UTC m=+104.565630473" observedRunningTime="2026-04-16 14:01:03.485043786 +0000 UTC m=+104.846586563" watchObservedRunningTime="2026-04-16 14:01:03.485071655 +0000 UTC m=+104.846614402" Apr 16 14:01:05.038646 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:05.038618 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-65gv2_aa030d80-2a63-4669-acb7-9485b1b8ce4a/dns-node-resolver/0.log" Apr 16 14:01:05.526959 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:05.526918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:01:05.527175 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:05.527066 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:01:05.527175 ip-10-0-139-151 kubenswrapper[2574]: E0416 
14:01:05.527084 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-87948cf45-bnd9h: secret "image-registry-tls" not found Apr 16 14:01:05.527175 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:05.527140 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls podName:b61adc64-b31c-42dc-b211-b78b9427eac1 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:13.527124667 +0000 UTC m=+114.888667424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls") pod "image-registry-87948cf45-bnd9h" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1") : secret "image-registry-tls" not found Apr 16 14:01:06.438892 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:06.438865 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xzrpt_567139c7-8d34-429b-bd38-0ab1aafa14e9/node-ca/0.log" Apr 16 14:01:07.437998 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:07.437973 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-jm8nq_956aabe3-df2b-45a4-bf9b-66468aff27cd/migrator/0.log" Apr 16 14:01:07.641165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:07.641137 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-jm8nq_956aabe3-df2b-45a4-bf9b-66468aff27cd/graceful-termination/0.log" Apr 16 14:01:13.593531 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:13.593492 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:01:13.595956 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:13.595931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"image-registry-87948cf45-bnd9h\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") " pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:01:13.674106 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:13.674065 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:01:13.794599 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:13.794558 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-87948cf45-bnd9h"] Apr 16 14:01:13.797331 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:01:13.797304 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61adc64_b31c_42dc_b211_b78b9427eac1.slice/crio-ea8cd9fbc6d867420979d0e4cc2dd3a7e725a9c860344d19da5bcee67850ee6b WatchSource:0}: Error finding container ea8cd9fbc6d867420979d0e4cc2dd3a7e725a9c860344d19da5bcee67850ee6b: Status 404 returned error can't find the container with id ea8cd9fbc6d867420979d0e4cc2dd3a7e725a9c860344d19da5bcee67850ee6b Apr 16 14:01:14.499429 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:14.499392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" event={"ID":"b61adc64-b31c-42dc-b211-b78b9427eac1","Type":"ContainerStarted","Data":"7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6"} Apr 16 14:01:14.499429 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:14.499429 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" event={"ID":"b61adc64-b31c-42dc-b211-b78b9427eac1","Type":"ContainerStarted","Data":"ea8cd9fbc6d867420979d0e4cc2dd3a7e725a9c860344d19da5bcee67850ee6b"} Apr 16 14:01:14.499678 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:14.499518 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" Apr 16 14:01:14.522428 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:14.522381 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" podStartSLOduration=17.522365841 podStartE2EDuration="17.522365841s" podCreationTimestamp="2026-04-16 14:00:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:01:14.522148598 +0000 UTC m=+115.883691365" watchObservedRunningTime="2026-04-16 14:01:14.522365841 +0000 UTC m=+115.883908606" Apr 16 14:01:23.724320 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.724285 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-2th59"] Apr 16 14:01:23.727704 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.727683 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.730686 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.730660 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:01:23.730812 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.730660 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-g94nx\"" Apr 16 14:01:23.730812 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.730718 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:01:23.745168 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.745139 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2th59"] Apr 16 14:01:23.788291 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.788258 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-87948cf45-bnd9h"] Apr 16 14:01:23.864684 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.864650 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.864894 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.864712 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-data-volume\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.864894 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.864781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.864894 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.864832 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-crio-socket\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.864894 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.864886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbgxl\" (UniqueName: \"kubernetes.io/projected/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-kube-api-access-tbgxl\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.965624 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.965562 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.965816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.965658 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-data-volume\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.965816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.965690 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.965816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.965722 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-crio-socket\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.965816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.965761 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tbgxl\" (UniqueName: \"kubernetes.io/projected/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-kube-api-access-tbgxl\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.966014 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.965877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-crio-socket\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.966076 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.966053 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-data-volume\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.966292 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.966274 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.968005 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.967988 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-insights-runtime-extractor-tls\") 
pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:23.975187 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:23.975132 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbgxl\" (UniqueName: \"kubernetes.io/projected/27a36c64-4ff1-437f-8dc4-ca6ff5387bfd-kube-api-access-tbgxl\") pod \"insights-runtime-extractor-2th59\" (UID: \"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd\") " pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:24.035808 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:24.035767 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-2th59" Apr 16 14:01:24.195935 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:01:24.195900 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27a36c64_4ff1_437f_8dc4_ca6ff5387bfd.slice/crio-e02fc167ad6a918194901774a095bd75834db14443a327df60d1565677c1f055 WatchSource:0}: Error finding container e02fc167ad6a918194901774a095bd75834db14443a327df60d1565677c1f055: Status 404 returned error can't find the container with id e02fc167ad6a918194901774a095bd75834db14443a327df60d1565677c1f055 Apr 16 14:01:24.196627 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:24.196572 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-2th59"] Apr 16 14:01:24.523916 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:24.523836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2th59" event={"ID":"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd","Type":"ContainerStarted","Data":"52527072009dc73ab9c42b4a5a547713cbb72d71d9ab1336366a022092711715"} Apr 16 14:01:24.523916 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:24.523875 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2th59" event={"ID":"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd","Type":"ContainerStarted","Data":"e02fc167ad6a918194901774a095bd75834db14443a327df60d1565677c1f055"} Apr 16 14:01:25.528257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:25.528223 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2th59" event={"ID":"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd","Type":"ContainerStarted","Data":"99b154a78634e9678b8a03fdebc849bddf473a814b702fb12b1f4fc6997884a8"} Apr 16 14:01:26.533892 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:26.533857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-2th59" event={"ID":"27a36c64-4ff1-437f-8dc4-ca6ff5387bfd","Type":"ContainerStarted","Data":"51c57351ee9e18cc80f334cb9e61c71a11905960e5f2d3b212cc7ab0c922538d"} Apr 16 14:01:26.552047 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:26.552005 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-2th59" podStartSLOduration=1.401486243 podStartE2EDuration="3.551991819s" podCreationTimestamp="2026-04-16 14:01:23 +0000 UTC" firstStartedPulling="2026-04-16 14:01:24.243510723 +0000 UTC m=+125.605053470" lastFinishedPulling="2026-04-16 14:01:26.394016296 +0000 UTC m=+127.755559046" observedRunningTime="2026-04-16 14:01:26.55126859 +0000 UTC m=+127.912811369" watchObservedRunningTime="2026-04-16 
Apr 16 14:01:28.904374 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:28.904318 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 14:01:28.906740 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:28.906715 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3d0ab572-848b-495c-afdf-ad744ea2b230-metrics-certs\") pod \"network-metrics-daemon-fbnhb\" (UID: \"3d0ab572-848b-495c-afdf-ad744ea2b230\") " pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 14:01:29.102923 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:29.102887 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x5z8x\""
Apr 16 14:01:29.111954 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:29.111926 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fbnhb"
Apr 16 14:01:29.227220 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:29.227188 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fbnhb"]
Apr 16 14:01:29.231492 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:01:29.231463 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d0ab572_848b_495c_afdf_ad744ea2b230.slice/crio-ef4fd6ed2748d1035d1bd2bc388c2571cb788049ff113ba0ac79c7790b6ddae5 WatchSource:0}: Error finding container ef4fd6ed2748d1035d1bd2bc388c2571cb788049ff113ba0ac79c7790b6ddae5: Status 404 returned error can't find the container with id ef4fd6ed2748d1035d1bd2bc388c2571cb788049ff113ba0ac79c7790b6ddae5
Apr 16 14:01:29.542934 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:29.542843 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbnhb" event={"ID":"3d0ab572-848b-495c-afdf-ad744ea2b230","Type":"ContainerStarted","Data":"ef4fd6ed2748d1035d1bd2bc388c2571cb788049ff113ba0ac79c7790b6ddae5"}
Apr 16 14:01:30.547019 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:30.546930 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbnhb" event={"ID":"3d0ab572-848b-495c-afdf-ad744ea2b230","Type":"ContainerStarted","Data":"3e5635aaaea51b3c7fa23466b941c957052db37f15468c37b1ea06dd31bc8410"}
Apr 16 14:01:30.547019 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:30.546969 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fbnhb" event={"ID":"3d0ab572-848b-495c-afdf-ad744ea2b230","Type":"ContainerStarted","Data":"6031a9a31fa9dec0501b2592962459262a0f85f790ec56cdf901e7cd2e22a0e0"}
Apr 16 14:01:33.792690 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:33.792653 2574 patch_prober.go:28] interesting pod/image-registry-87948cf45-bnd9h container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:01:33.793078 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:33.792711 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" podUID="b61adc64-b31c-42dc-b211-b78b9427eac1" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
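The pair of entries above record a failing HTTP readiness probe against the registry container: a status outside the 2xx/3xx range (here a 503 from the registry's health endpoint) marks the pod not ready, and the start of the response body is captured alongside the status code. The same failure repeats at 14:01:43 below. A simplified sketch of that check; the URL and timeout are placeholders rather than the pod's actual probe configuration:

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    // Sketch of the readiness behaviour shown in the entries above: an HTTP GET
    // where 2xx/3xx means ready and anything else is reported as a failure
    // together with the start of the response body.
    func probeReady(url string) (bool, string) {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return false, err.Error()
        }
        defer resp.Body.Close()
        // Keep only the start of the body, as the "start-of-body=" field does.
        body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024))
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return true, ""
        }
        return false, fmt.Sprintf("HTTP probe failed with statuscode: %d, start-of-body: %s", resp.StatusCode, body)
    }

    func main() {
        ready, detail := probeReady("http://127.0.0.1:5000/healthz") // placeholder endpoint
        fmt.Println(ready, detail)
    }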
Apr 16 14:01:36.217056 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.217003 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fbnhb" podStartSLOduration=136.254934438 podStartE2EDuration="2m17.216987497s" podCreationTimestamp="2026-04-16 13:59:19 +0000 UTC" firstStartedPulling="2026-04-16 14:01:29.233802267 +0000 UTC m=+130.595345017" lastFinishedPulling="2026-04-16 14:01:30.195855104 +0000 UTC m=+131.557398076" observedRunningTime="2026-04-16 14:01:30.56382288 +0000 UTC m=+131.925365649" watchObservedRunningTime="2026-04-16 14:01:36.216987497 +0000 UTC m=+137.578530265"
Apr 16 14:01:36.217555 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.217539 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-4bf7x"]
Apr 16 14:01:36.219528 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.219511 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x"
Apr 16 14:01:36.221627 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.221605 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 14:01:36.222542 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.222521 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-xlqh7\""
Apr 16 14:01:36.222634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.222547 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 14:01:36.222634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.222559 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:01:36.222724 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.222661 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 16 14:01:36.222823 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.222808 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 16 14:01:36.227881 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.227861 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-4bf7x"]
Apr 16 14:01:36.359142 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.359102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x"
Apr 16 14:01:36.359315 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.359162 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9wvc\" (UniqueName:
\"kubernetes.io/projected/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-kube-api-access-r9wvc\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.359315 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.359229 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.359315 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.359274 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-metrics-client-ca\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.459808 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.459778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.459977 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.459827 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9wvc\" (UniqueName: \"kubernetes.io/projected/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-kube-api-access-r9wvc\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.459977 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.459847 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.459977 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.459878 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-metrics-client-ca\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.460563 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.460539 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-metrics-client-ca\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.462188 ip-10-0-139-151 kubenswrapper[2574]: I0416 
14:01:36.462165 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.462270 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.462169 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.470387 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.470336 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9wvc\" (UniqueName: \"kubernetes.io/projected/d84ae69b-8b20-4dc2-bb8b-46889ca16d3e-kube-api-access-r9wvc\") pod \"prometheus-operator-78f957474d-4bf7x\" (UID: \"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e\") " pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.528549 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.528520 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" Apr 16 14:01:36.642145 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:36.642120 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-4bf7x"] Apr 16 14:01:36.644667 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:01:36.644638 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd84ae69b_8b20_4dc2_bb8b_46889ca16d3e.slice/crio-92291d805fbb3f6d02e135c9c6fdd49e06ae4e2d60e4041f44e4ef1cd69e2bf4 WatchSource:0}: Error finding container 92291d805fbb3f6d02e135c9c6fdd49e06ae4e2d60e4041f44e4ef1cd69e2bf4: Status 404 returned error can't find the container with id 92291d805fbb3f6d02e135c9c6fdd49e06ae4e2d60e4041f44e4ef1cd69e2bf4 Apr 16 14:01:37.570617 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:37.570559 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" event={"ID":"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e","Type":"ContainerStarted","Data":"92291d805fbb3f6d02e135c9c6fdd49e06ae4e2d60e4041f44e4ef1cd69e2bf4"} Apr 16 14:01:38.574549 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:38.574510 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" event={"ID":"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e","Type":"ContainerStarted","Data":"fdcf0411b055e98a294d67fd2bdd8a6d1ad6b84b57e88d0113fe9c159a2c71c8"} Apr 16 14:01:38.574949 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:38.574555 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" event={"ID":"d84ae69b-8b20-4dc2-bb8b-46889ca16d3e","Type":"ContainerStarted","Data":"89ba49dd85aa50b1be5a710ee7d9783c5a9b3f8ccaa6e74105648b1af84d38ed"} Apr 16 14:01:38.592339 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:38.592288 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/prometheus-operator-78f957474d-4bf7x" podStartSLOduration=1.37440092 podStartE2EDuration="2.5922745s" podCreationTimestamp="2026-04-16 14:01:36 +0000 UTC" firstStartedPulling="2026-04-16 14:01:36.646337317 +0000 UTC m=+138.007880063" lastFinishedPulling="2026-04-16 14:01:37.864210897 +0000 UTC m=+139.225753643" observedRunningTime="2026-04-16 14:01:38.591263214 +0000 UTC m=+139.952805982" watchObservedRunningTime="2026-04-16 14:01:38.5922745 +0000 UTC m=+139.953817273" Apr 16 14:01:40.635144 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.635113 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gwdbk"] Apr 16 14:01:40.637244 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.637227 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.639447 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.639425 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:01:40.639598 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.639457 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wstcq\"" Apr 16 14:01:40.639707 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.639694 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:01:40.639877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.639864 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:01:40.797313 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797272 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797313 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797317 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-accelerators-collector-config\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797530 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797406 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-tls\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797530 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbcf63e7-182a-4aaf-a012-585f17a5a74a-metrics-client-ca\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " 
pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797530 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrt6\" (UniqueName: \"kubernetes.io/projected/bbcf63e7-182a-4aaf-a012-585f17a5a74a-kube-api-access-rlrt6\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797530 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797494 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-root\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797534 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-wtmp\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797561 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-sys\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.797687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.797599 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-textfile\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.898618 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898515 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-tls\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.898618 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbcf63e7-182a-4aaf-a012-585f17a5a74a-metrics-client-ca\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.898618 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898596 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrt6\" (UniqueName: \"kubernetes.io/projected/bbcf63e7-182a-4aaf-a012-585f17a5a74a-kube-api-access-rlrt6\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.898823 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-root\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.898823 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-wtmp\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.898823 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898680 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-sys\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.898823 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:40.898683 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:01:40.898823 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:40.898770 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-tls podName:bbcf63e7-182a-4aaf-a012-585f17a5a74a nodeName:}" failed. No retries permitted until 2026-04-16 14:01:41.398745833 +0000 UTC m=+142.760288597 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-tls") pod "node-exporter-gwdbk" (UID: "bbcf63e7-182a-4aaf-a012-585f17a5a74a") : secret "node-exporter-tls" not found Apr 16 14:01:40.898823 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898805 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-sys\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.899063 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898813 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-wtmp\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.899063 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898845 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/bbcf63e7-182a-4aaf-a012-585f17a5a74a-root\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.899063 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.898879 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-textfile\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk" Apr 16 14:01:40.899063 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.899020 2574 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:40.899063 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.899050 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-accelerators-collector-config\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:40.899297 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.899262 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-textfile\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:40.899297 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.899267 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/bbcf63e7-182a-4aaf-a012-585f17a5a74a-metrics-client-ca\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:40.899544 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.899514 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-accelerators-collector-config\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:40.901460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.901425 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:40.922262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:40.922231 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrt6\" (UniqueName: \"kubernetes.io/projected/bbcf63e7-182a-4aaf-a012-585f17a5a74a-kube-api-access-rlrt6\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:41.402346 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:41.402308 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-tls\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:41.404565 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:41.404546 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/bbcf63e7-182a-4aaf-a012-585f17a5a74a-node-exporter-tls\") pod \"node-exporter-gwdbk\" (UID: \"bbcf63e7-182a-4aaf-a012-585f17a5a74a\") " pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:41.546298 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:41.546265 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gwdbk"
Apr 16 14:01:41.554442 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:01:41.554415 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbcf63e7_182a_4aaf_a012_585f17a5a74a.slice/crio-ecf0672412cb35af8797850527a42440fb57180ccc00b4a186fb4fdc9d98e807 WatchSource:0}: Error finding container ecf0672412cb35af8797850527a42440fb57180ccc00b4a186fb4fdc9d98e807: Status 404 returned error can't find the container with id ecf0672412cb35af8797850527a42440fb57180ccc00b4a186fb4fdc9d98e807
Apr 16 14:01:41.582566 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:41.582530 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gwdbk" event={"ID":"bbcf63e7-182a-4aaf-a012-585f17a5a74a","Type":"ContainerStarted","Data":"ecf0672412cb35af8797850527a42440fb57180ccc00b4a186fb4fdc9d98e807"}
Apr 16 14:01:43.793873 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:43.793842 2574 patch_prober.go:28] interesting pod/image-registry-87948cf45-bnd9h container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 16 14:01:43.794395 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:43.793895 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" podUID="b61adc64-b31c-42dc-b211-b78b9427eac1" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:01:44.591993 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:44.591956 2574 generic.go:358] "Generic (PLEG): container finished" podID="bbcf63e7-182a-4aaf-a012-585f17a5a74a" containerID="07238ee4939d9edf150c9108d52bae06e76f4b869d0f9db56597e66b85f060ab" exitCode=0
Apr 16 14:01:44.592155 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:44.592000 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gwdbk" event={"ID":"bbcf63e7-182a-4aaf-a012-585f17a5a74a","Type":"ContainerDied","Data":"07238ee4939d9edf150c9108d52bae06e76f4b869d0f9db56597e66b85f060ab"}
Apr 16 14:01:45.596476 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:45.596437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gwdbk" event={"ID":"bbcf63e7-182a-4aaf-a012-585f17a5a74a","Type":"ContainerStarted","Data":"dc43bab2905448744b1411dc908b28787482320f52a9e816656bde68dc217c61"}
Apr 16 14:01:45.596476 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:45.596480 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gwdbk" event={"ID":"bbcf63e7-182a-4aaf-a012-585f17a5a74a","Type":"ContainerStarted","Data":"bf0e06f9887e0733611825787020a41fa68f03df7dd0c5a682a36b959f00b644"}
Apr 16 14:01:45.616934 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:45.616886 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gwdbk" podStartSLOduration=3.185094746 podStartE2EDuration="5.616871006s" podCreationTimestamp="2026-04-16 14:01:40 +0000 UTC" firstStartedPulling="2026-04-16 14:01:41.556362911 +0000 UTC m=+142.917905660" lastFinishedPulling="2026-04-16 14:01:43.988139164 +0000 UTC m=+145.349681920" observedRunningTime="2026-04-16 14:01:45.615953671 +0000 UTC m=+146.977496438" watchObservedRunningTime="2026-04-16 14:01:45.616871006 +0000 UTC m=+146.978413751"
Apr 16 14:01:46.816090 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.816055 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:46.819375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.819351 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.823594 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.823559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:01:46.823931 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.823902 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:01:46.824035 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.823910 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:01:46.824035 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.823970 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:01:46.825017 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825000 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eolrm4d9hr1ql\""
Apr 16 14:01:46.825099 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825015 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mxmxm\""
Apr 16 14:01:46.825099 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825043 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 14:01:46.825293 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825279 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:01:46.825514 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825496 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:01:46.825514 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825495 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:01:46.825674 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825543 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:01:46.825674 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825506 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:01:46.825674 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825632 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:01:46.825818 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.825546 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:01:46.828688 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.828522 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839607 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-config-out\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839703 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839750 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839781 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839807 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839837 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839876 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839930 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-config\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.840211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.839984 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m72cq\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-kube-api-access-m72cq\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.841061 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.840016 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.841061 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.840044 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-web-config\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.850048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.849656 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:46.940989 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.940950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.940989 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.940996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941020 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941045 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941072 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941126 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941153 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941308 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941299 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941341 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-config\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941422 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m72cq\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-kube-api-access-m72cq\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941468 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-web-config\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941494 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941528 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941560 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-config-out\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.941982 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.941664 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.942064 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.942030 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.942331 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.942308 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.943281 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.943255 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.944373 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.944346 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.944903 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.944850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.945410 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.945384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-config-out\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.945512 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.945439 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.945599 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.945551 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.946009 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.945971 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.946158 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.946136 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.946659 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.946618 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-web-config\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.946753 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.946678 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.946940 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.946918 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.947013 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.946998 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-config\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.947387 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.947367 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.947676 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.947660 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:46.954951 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:46.954911 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m72cq\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-kube-api-access-m72cq\") pod \"prometheus-k8s-0\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:47.131615 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:47.131563 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:47.286899 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:47.286872 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:47.288435 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:01:47.288396 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90c6d285_34cf_4fa2_ae69_870acdbe1454.slice/crio-f8524dd7a6dbdf7df3e933a8f99cac45cf7cb9321427dc64eb9341ee1c4a4f42 WatchSource:0}: Error finding container f8524dd7a6dbdf7df3e933a8f99cac45cf7cb9321427dc64eb9341ee1c4a4f42: Status 404 returned error can't find the container with id f8524dd7a6dbdf7df3e933a8f99cac45cf7cb9321427dc64eb9341ee1c4a4f42
Apr 16 14:01:47.603418 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:47.603387 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerStarted","Data":"f8524dd7a6dbdf7df3e933a8f99cac45cf7cb9321427dc64eb9341ee1c4a4f42"}
Apr 16 14:01:48.607210 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:48.607174 2574 generic.go:358] "Generic (PLEG): container finished" podID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerID="df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0" exitCode=0
Apr 16 14:01:48.607607 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:48.607262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0"}
Apr 16 14:01:48.806302 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:48.806262 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" podUID="b61adc64-b31c-42dc-b211-b78b9427eac1" containerName="registry" containerID="cri-o://7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6" gracePeriod=30
Apr 16 14:01:48.832092 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:48.832067 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb61adc64_b31c_42dc_b211_b78b9427eac1.slice/crio-7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6.scope\": RecentStats: unable to find data in memory cache]"
Apr 16 14:01:49.029039 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.029016 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-87948cf45-bnd9h"
Apr 16 14:01:49.060509 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060483 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-installation-pull-secrets\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.060660 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060519 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bjwn\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-kube-api-access-8bjwn\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.060660 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060545 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-certificates\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.060660 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060564 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-image-registry-private-configuration\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.060827 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060760 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.060827 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060800 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-trusted-ca\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.060929 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060848 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-bound-sa-token\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.060929 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.060892 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b61adc64-b31c-42dc-b211-b78b9427eac1-ca-trust-extracted\") pod \"b61adc64-b31c-42dc-b211-b78b9427eac1\" (UID: \"b61adc64-b31c-42dc-b211-b78b9427eac1\") "
Apr 16 14:01:49.061505 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.061029 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:01:49.061505 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.061329 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-certificates\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.061505 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.061338 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:01:49.063337 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.063305 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:01:49.063337 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.063322 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-kube-api-access-8bjwn" (OuterVolumeSpecName: "kube-api-access-8bjwn") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "kube-api-access-8bjwn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:01:49.063493 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.063414 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:01:49.063549 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.063513 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:01:49.063984 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.063962 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:01:49.070880 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.070854 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b61adc64-b31c-42dc-b211-b78b9427eac1-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b61adc64-b31c-42dc-b211-b78b9427eac1" (UID: "b61adc64-b31c-42dc-b211-b78b9427eac1"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:01:49.162048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.162019 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b61adc64-b31c-42dc-b211-b78b9427eac1-ca-trust-extracted\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.162048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.162049 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-installation-pull-secrets\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.162248 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.162062 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8bjwn\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-kube-api-access-8bjwn\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.162248 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.162072 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b61adc64-b31c-42dc-b211-b78b9427eac1-image-registry-private-configuration\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.162248 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.162082 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-registry-tls\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.162248 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.162090 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b61adc64-b31c-42dc-b211-b78b9427eac1-trusted-ca\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.162248 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.162098 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b61adc64-b31c-42dc-b211-b78b9427eac1-bound-sa-token\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:01:49.611679 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.611642 2574 generic.go:358] "Generic (PLEG): container finished" podID="b61adc64-b31c-42dc-b211-b78b9427eac1" containerID="7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6" exitCode=0
Apr 16 14:01:49.612123 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.611725 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-87948cf45-bnd9h"
Apr 16 14:01:49.612123 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.611723 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" event={"ID":"b61adc64-b31c-42dc-b211-b78b9427eac1","Type":"ContainerDied","Data":"7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6"}
Apr 16 14:01:49.612123 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.611857 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-87948cf45-bnd9h" event={"ID":"b61adc64-b31c-42dc-b211-b78b9427eac1","Type":"ContainerDied","Data":"ea8cd9fbc6d867420979d0e4cc2dd3a7e725a9c860344d19da5bcee67850ee6b"}
Apr 16 14:01:49.612123 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.611881 2574 scope.go:117] "RemoveContainer" containerID="7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6"
Apr 16 14:01:49.620383 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.620366 2574 scope.go:117] "RemoveContainer" containerID="7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6"
Apr 16 14:01:49.620705 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:49.620678 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6\": container with ID starting with 7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6 not found: ID does not exist" containerID="7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6"
Apr 16 14:01:49.620806 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.620715 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6"} err="failed to get container status \"7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6\": rpc error: code = NotFound desc = could not find container \"7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6\": container with ID starting with 7ae67480bc50e79d6ab89122ceb0126fa8b9fb334c96c833b989e4b60987c6c6 not found: ID does not exist"
Apr 16 14:01:49.628017 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.627992 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-87948cf45-bnd9h"]
Apr 16 14:01:49.631935 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:49.631906 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-87948cf45-bnd9h"]
Apr 16 14:01:51.187168 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:51.187134 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b61adc64-b31c-42dc-b211-b78b9427eac1" path="/var/lib/kubelet/pods/b61adc64-b31c-42dc-b211-b78b9427eac1/volumes"
Apr 16 14:01:51.620163 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:51.620130 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerStarted","Data":"faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca"}
Apr 16 14:01:51.620163 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:51.620166 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerStarted","Data":"f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429"}
Apr 16 14:01:54.629469 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:54.629437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerStarted","Data":"f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705"}
Apr 16 14:01:54.629469 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:54.629472 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerStarted","Data":"b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24"}
Apr 16 14:01:54.629987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:54.629483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerStarted","Data":"19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f"}
Apr 16 14:01:54.629987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:54.629494 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerStarted","Data":"b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5"}
Apr 16 14:01:54.660411 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:54.660363 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.18684603 podStartE2EDuration="8.660346841s" podCreationTimestamp="2026-04-16 14:01:46 +0000 UTC" firstStartedPulling="2026-04-16 14:01:47.290417471 +0000 UTC m=+148.651960216" lastFinishedPulling="2026-04-16 14:01:53.76391827 +0000 UTC m=+155.125461027" observedRunningTime="2026-04-16 14:01:54.658840479 +0000 UTC m=+156.020383246" watchObservedRunningTime="2026-04-16 14:01:54.660346841 +0000 UTC m=+156.021889608"
Apr 16 14:01:55.065334 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:55.065230 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-84w8w" podUID="defde0b7-6e87-43ef-ad17-0ce3ffc5d902"
Apr 16 14:01:55.073387 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:01:55.073344 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-2bcnf" podUID="e05dc6e0-3aae-437d-a7bd-6b5851441185"
Apr 16 14:01:55.632326 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:55.632296 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2bcnf"
Apr 16 14:01:55.632718 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:55.632298 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-84w8w"
Apr 16 14:01:57.132128 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:57.132087 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:59.955946 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:59.955891 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf"
Apr 16 14:01:59.956334 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:59.955965 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w"
Apr 16 14:01:59.958304 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:59.958281 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/defde0b7-6e87-43ef-ad17-0ce3ffc5d902-metrics-tls\") pod \"dns-default-84w8w\" (UID: \"defde0b7-6e87-43ef-ad17-0ce3ffc5d902\") " pod="openshift-dns/dns-default-84w8w"
Apr 16 14:01:59.958753 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:01:59.958733 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e05dc6e0-3aae-437d-a7bd-6b5851441185-cert\") pod \"ingress-canary-2bcnf\" (UID: \"e05dc6e0-3aae-437d-a7bd-6b5851441185\") " pod="openshift-ingress-canary/ingress-canary-2bcnf"
Apr 16 14:02:00.135254 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.135224 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tpzlf\""
Apr 16 14:02:00.135810 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.135787 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbpqw\""
Apr 16 14:02:00.143668 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.143650 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-84w8w"
Apr 16 14:02:00.143764 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.143729 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2bcnf"
Apr 16 14:02:00.267407 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.267378 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2bcnf"]
Apr 16 14:02:00.269371 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:02:00.269343 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode05dc6e0_3aae_437d_a7bd_6b5851441185.slice/crio-79d2eb41ac405c63bb68654efaf3048c4be501c5876ae77df5a28173cbc6dfe4 WatchSource:0}: Error finding container 79d2eb41ac405c63bb68654efaf3048c4be501c5876ae77df5a28173cbc6dfe4: Status 404 returned error can't find the container with id 79d2eb41ac405c63bb68654efaf3048c4be501c5876ae77df5a28173cbc6dfe4
Apr 16 14:02:00.282019 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.281958 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-84w8w"]
Apr 16 14:02:00.284194 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:02:00.284171 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddefde0b7_6e87_43ef_ad17_0ce3ffc5d902.slice/crio-0e12a3e5855735349056b816d0420c45f674bfbf4d0c91c59abf1f20e74f25f8 WatchSource:0}: Error finding container 0e12a3e5855735349056b816d0420c45f674bfbf4d0c91c59abf1f20e74f25f8: Status 404 returned error can't find the container with id 0e12a3e5855735349056b816d0420c45f674bfbf4d0c91c59abf1f20e74f25f8
Apr 16 14:02:00.646513 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.646479 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2bcnf" event={"ID":"e05dc6e0-3aae-437d-a7bd-6b5851441185","Type":"ContainerStarted","Data":"79d2eb41ac405c63bb68654efaf3048c4be501c5876ae77df5a28173cbc6dfe4"}
Apr 16 14:02:00.647341 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:00.647306 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-84w8w" event={"ID":"defde0b7-6e87-43ef-ad17-0ce3ffc5d902","Type":"ContainerStarted","Data":"0e12a3e5855735349056b816d0420c45f674bfbf4d0c91c59abf1f20e74f25f8"}
Apr 16 14:02:02.658369 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:02.658283 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2bcnf" event={"ID":"e05dc6e0-3aae-437d-a7bd-6b5851441185","Type":"ContainerStarted","Data":"92fa3c205a0ccf4475c6d9db60933bb7c83f486babb9888f1b3ae8ba147ab39d"}
Apr 16 14:02:02.659919 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:02.659894 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-84w8w" event={"ID":"defde0b7-6e87-43ef-ad17-0ce3ffc5d902","Type":"ContainerStarted","Data":"dde789e7eac2620401026e84f5d4c341d28a7a24586f8c3f664369b02b8a25db"}
Apr 16 14:02:02.660020 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:02.659928 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-84w8w" event={"ID":"defde0b7-6e87-43ef-ad17-0ce3ffc5d902","Type":"ContainerStarted","Data":"72007e6a959674eed2c7df5688bd88a4fde1f0b8ca5528bf7aad6366b77ea9fa"}
Apr 16 14:02:02.660055 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:02.660012 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-84w8w"
Apr 16 14:02:02.676670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:02.676629 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2bcnf" podStartSLOduration=128.561942551 podStartE2EDuration="2m10.676617146s" podCreationTimestamp="2026-04-16 13:59:52 +0000 UTC" firstStartedPulling="2026-04-16 14:02:00.271560813 +0000 UTC m=+161.633103559" lastFinishedPulling="2026-04-16 14:02:02.386235408 +0000 UTC m=+163.747778154" observedRunningTime="2026-04-16 14:02:02.675567132 +0000 UTC m=+164.037109900" watchObservedRunningTime="2026-04-16 14:02:02.676617146 +0000 UTC m=+164.038159910"
Apr 16 14:02:02.694456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:02.694402 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-84w8w" podStartSLOduration=128.593546972 podStartE2EDuration="2m10.694384161s" podCreationTimestamp="2026-04-16 13:59:52 +0000 UTC" firstStartedPulling="2026-04-16 14:02:00.285903339 +0000 UTC m=+161.647446088" lastFinishedPulling="2026-04-16 14:02:02.386740529 +0000 UTC m=+163.748283277" observedRunningTime="2026-04-16 14:02:02.693305054 +0000 UTC m=+164.054847827" watchObservedRunningTime="2026-04-16 14:02:02.694384161 +0000 UTC m=+164.055926931"
Apr 16 14:02:12.665128 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:12.665098 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-84w8w"
Apr 16 14:02:32.745274 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:32.745231 2574 generic.go:358] "Generic (PLEG): container finished" podID="7c051fe0-3220-4517-8c4c-4c0a8bf7518d" containerID="9cc2fefad0c395c1080960422e04e9a7fcede633fa5121e69b4d2dbbd35b3dd6" exitCode=0
Apr 16 14:02:32.745886 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:32.745311 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" event={"ID":"7c051fe0-3220-4517-8c4c-4c0a8bf7518d","Type":"ContainerDied","Data":"9cc2fefad0c395c1080960422e04e9a7fcede633fa5121e69b4d2dbbd35b3dd6"}
Apr 16 14:02:32.745886 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:32.745666 2574 scope.go:117] "RemoveContainer" containerID="9cc2fefad0c395c1080960422e04e9a7fcede633fa5121e69b4d2dbbd35b3dd6"
Apr 16 14:02:33.749226 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:33.749194 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xpgj4" event={"ID":"7c051fe0-3220-4517-8c4c-4c0a8bf7518d","Type":"ContainerStarted","Data":"98db041841fd5d24e7862f30bad907d056602e91b769c2847e7b26c6b20321bd"}
Apr 16 14:02:38.514525 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:38.514497 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gwdbk_bbcf63e7-182a-4aaf-a012-585f17a5a74a/init-textfile/0.log"
Apr 16 14:02:38.655499 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:38.655472 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gwdbk_bbcf63e7-182a-4aaf-a012-585f17a5a74a/node-exporter/0.log"
Apr 16 14:02:38.857503 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:38.857474 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gwdbk_bbcf63e7-182a-4aaf-a012-585f17a5a74a/kube-rbac-proxy/0.log"
Apr 16 14:02:39.655022 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:39.654995 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90c6d285-34cf-4fa2-ae69-870acdbe1454/init-config-reloader/0.log"
Apr 16 14:02:39.859242 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:39.859215 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90c6d285-34cf-4fa2-ae69-870acdbe1454/prometheus/0.log"
Apr 16 14:02:40.056233 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:40.056156 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90c6d285-34cf-4fa2-ae69-870acdbe1454/config-reloader/0.log"
Apr 16 14:02:40.256852 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:40.256826 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90c6d285-34cf-4fa2-ae69-870acdbe1454/thanos-sidecar/0.log"
Apr 16 14:02:40.457821 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:40.457791 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90c6d285-34cf-4fa2-ae69-870acdbe1454/kube-rbac-proxy-web/0.log"
Apr 16 14:02:40.655203 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:40.655177 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90c6d285-34cf-4fa2-ae69-870acdbe1454/kube-rbac-proxy/0.log"
Apr 16 14:02:40.856902 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:40.856875 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_90c6d285-34cf-4fa2-ae69-870acdbe1454/kube-rbac-proxy-thanos/0.log"
Apr 16 14:02:41.064137 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:41.064111 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-4bf7x_d84ae69b-8b20-4dc2-bb8b-46889ca16d3e/prometheus-operator/0.log"
Apr 16 14:02:41.255733 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:41.255659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-4bf7x_d84ae69b-8b20-4dc2-bb8b-46889ca16d3e/kube-rbac-proxy/0.log"
Apr 16 14:02:45.054404 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:45.054377 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xzrpt_567139c7-8d34-429b-bd38-0ab1aafa14e9/node-ca/0.log"
Apr 16 14:02:45.454718 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:45.454688 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2bcnf_e05dc6e0-3aae-437d-a7bd-6b5851441185/serve-healthcheck-canary/0.log"
Apr 16 14:02:47.132336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:47.132302 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:47.164120 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:47.163801 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:47.808187 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:02:47.808156 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:05.294696 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.294662 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:03:05.295952 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.295892 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-thanos" containerID="cri-o://f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705" gracePeriod=600
Apr 16 14:03:05.296614 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.296561 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="prometheus" containerID="cri-o://f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429" gracePeriod=600
Apr 16 14:03:05.296744 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.296622 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="thanos-sidecar" containerID="cri-o://b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5" gracePeriod=600
Apr 16 14:03:05.296744 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.296649 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy" containerID="cri-o://b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24" gracePeriod=600
Apr 16 14:03:05.296744 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.296712 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="config-reloader" containerID="cri-o://faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca" gracePeriod=600
Apr 16 14:03:05.297243 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.296572 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-web" containerID="cri-o://19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f" gracePeriod=600
Apr 16 14:03:05.843573 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843543 2574 generic.go:358] "Generic (PLEG): container finished" podID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerID="f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705" exitCode=0
Apr 16 14:03:05.843573 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843565 2574 generic.go:358] "Generic (PLEG): container finished" podID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerID="b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24" exitCode=0
Apr 16 14:03:05.843573 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843573 2574 generic.go:358] "Generic (PLEG): container finished" podID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerID="b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5" exitCode=0
Apr 16 14:03:05.843573 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843598 2574 generic.go:358] "Generic (PLEG): container finished" podID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerID="faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca" exitCode=0
Apr 16 14:03:05.843573 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843603 2574 generic.go:358] "Generic (PLEG): container finished" podID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerID="f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429" exitCode=0
Apr 16 14:03:05.843877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843612 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705"}
Apr 16 14:03:05.843877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843646 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24"}
Apr 16 14:03:05.843877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843656 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5"}
Apr 16 14:03:05.843877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843665 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca"}
Apr 16 14:03:05.843877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:05.843673 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429"}
Apr 16 14:03:06.536110 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.536087 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:06.649460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649366 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-rulefiles-0\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") "
Apr 16 14:03:06.649460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649408 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-config\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") "
Apr 16 14:03:06.649460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649428 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-thanos-prometheus-http-client-file\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") "
Apr 16 14:03:06.649460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649452 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-config-out\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") "
Apr 16 14:03:06.649790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649470 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName:
\"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649675 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-grpc-tls\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649733 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-db\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649762 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m72cq\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-kube-api-access-m72cq\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649796 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-metrics-client-ca\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649831 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-metrics-client-certs\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649855 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-kube-rbac-proxy\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649902 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-tls-assets\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649933 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-tls\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.649987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649961 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-kubelet-serving-ca-bundle\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.650257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.649991 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-web-config\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.650257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.650017 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.650257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.650056 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-serving-certs-ca-bundle\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.650257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.650092 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-trusted-ca-bundle\") pod \"90c6d285-34cf-4fa2-ae69-870acdbe1454\" (UID: \"90c6d285-34cf-4fa2-ae69-870acdbe1454\") " Apr 16 14:03:06.651772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.650679 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:06.651772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.651135 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:06.651772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.651137 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:03:06.651772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.651326 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:06.651772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.651409 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:06.651772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.651742 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:03:06.652329 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.652298 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.652446 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.652421 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-config" (OuterVolumeSpecName: "config") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.652506 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.652486 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.653795 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.653769 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.653795 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.653785 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.653946 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.653836 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.654001 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.653950 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.654400 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.654377 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:06.654535 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.654517 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-config-out" (OuterVolumeSpecName: "config-out") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:03:06.654923 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.654896 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-kube-api-access-m72cq" (OuterVolumeSpecName: "kube-api-access-m72cq") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "kube-api-access-m72cq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:03:06.655190 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.655159 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.664439 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.664346 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-web-config" (OuterVolumeSpecName: "web-config") pod "90c6d285-34cf-4fa2-ae69-870acdbe1454" (UID: "90c6d285-34cf-4fa2-ae69-870acdbe1454"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:03:06.750905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750855 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-db\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.750905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750897 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m72cq\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-kube-api-access-m72cq\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.750905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750907 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-metrics-client-ca\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.750905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750919 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-metrics-client-certs\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750928 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-kube-rbac-proxy\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750939 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/90c6d285-34cf-4fa2-ae69-870acdbe1454-tls-assets\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750949 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-tls\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750960 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750969 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-web-config\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750979 2574 reconciler_common.go:299] "Volume 
detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750988 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.750997 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-trusted-ca-bundle\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.751005 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/90c6d285-34cf-4fa2-ae69-870acdbe1454-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.751015 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-config\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.751023 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-thanos-prometheus-http-client-file\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.751032 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/90c6d285-34cf-4fa2-ae69-870acdbe1454-config-out\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.751040 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.751165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.751048 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/90c6d285-34cf-4fa2-ae69-870acdbe1454-secret-grpc-tls\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:03:06.848774 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.848738 2574 generic.go:358] "Generic (PLEG): container finished" podID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerID="19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f" exitCode=0 Apr 16 14:03:06.848774 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.848776 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f"} Apr 16 14:03:06.849030 ip-10-0-139-151 kubenswrapper[2574]: 
I0416 14:03:06.848803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"90c6d285-34cf-4fa2-ae69-870acdbe1454","Type":"ContainerDied","Data":"f8524dd7a6dbdf7df3e933a8f99cac45cf7cb9321427dc64eb9341ee1c4a4f42"} Apr 16 14:03:06.849030 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.848821 2574 scope.go:117] "RemoveContainer" containerID="f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705" Apr 16 14:03:06.849030 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.848871 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.855977 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.855958 2574 scope.go:117] "RemoveContainer" containerID="b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24" Apr 16 14:03:06.863945 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.863924 2574 scope.go:117] "RemoveContainer" containerID="19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f" Apr 16 14:03:06.870316 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.870299 2574 scope.go:117] "RemoveContainer" containerID="b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5" Apr 16 14:03:06.874286 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.874261 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:06.876514 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.876498 2574 scope.go:117] "RemoveContainer" containerID="faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca" Apr 16 14:03:06.881139 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.881117 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:06.883377 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.883276 2574 scope.go:117] "RemoveContainer" containerID="f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429" Apr 16 14:03:06.889928 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.889910 2574 scope.go:117] "RemoveContainer" containerID="df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0" Apr 16 14:03:06.895913 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.895897 2574 scope.go:117] "RemoveContainer" containerID="f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705" Apr 16 14:03:06.896148 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:03:06.896130 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705\": container with ID starting with f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705 not found: ID does not exist" containerID="f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705" Apr 16 14:03:06.896211 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896161 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705"} err="failed to get container status \"f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705\": rpc error: code = NotFound desc = could not find container \"f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705\": container with ID starting with f1176de6e67092e15de2e981cae29c8e3a812b5ba48f1e3fb4500e46e57d6705 not found: ID does not exist" Apr 16 14:03:06.896211 
ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896189 2574 scope.go:117] "RemoveContainer" containerID="b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24" Apr 16 14:03:06.896401 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:03:06.896387 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24\": container with ID starting with b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24 not found: ID does not exist" containerID="b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24" Apr 16 14:03:06.896459 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896409 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24"} err="failed to get container status \"b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24\": rpc error: code = NotFound desc = could not find container \"b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24\": container with ID starting with b6bd57ab3d3c61c4cf9d71482b0380c34c2f58b797c6233a485a5575aec7cf24 not found: ID does not exist" Apr 16 14:03:06.896459 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896428 2574 scope.go:117] "RemoveContainer" containerID="19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f" Apr 16 14:03:06.896652 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:03:06.896630 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f\": container with ID starting with 19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f not found: ID does not exist" containerID="19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f" Apr 16 14:03:06.896694 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896658 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f"} err="failed to get container status \"19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f\": rpc error: code = NotFound desc = could not find container \"19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f\": container with ID starting with 19de27c1e03d7478c72f9d239466a6801754097b586281c69339570bcdd7965f not found: ID does not exist" Apr 16 14:03:06.896694 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896674 2574 scope.go:117] "RemoveContainer" containerID="b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5" Apr 16 14:03:06.896890 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:03:06.896875 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5\": container with ID starting with b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5 not found: ID does not exist" containerID="b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5" Apr 16 14:03:06.896934 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896893 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5"} err="failed to get container status 
\"b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5\": rpc error: code = NotFound desc = could not find container \"b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5\": container with ID starting with b9c4f8351ec67323855f7bce52458222bdae13cd70cc91e85cc3e21d6eb852d5 not found: ID does not exist" Apr 16 14:03:06.896934 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.896905 2574 scope.go:117] "RemoveContainer" containerID="faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca" Apr 16 14:03:06.897126 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:03:06.897109 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca\": container with ID starting with faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca not found: ID does not exist" containerID="faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca" Apr 16 14:03:06.897179 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.897135 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca"} err="failed to get container status \"faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca\": rpc error: code = NotFound desc = could not find container \"faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca\": container with ID starting with faca0be3660f25affb5a1155d828beafed1191d59a2f9a73e33c1b21cad781ca not found: ID does not exist" Apr 16 14:03:06.897179 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.897155 2574 scope.go:117] "RemoveContainer" containerID="f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429" Apr 16 14:03:06.897383 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:03:06.897366 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429\": container with ID starting with f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429 not found: ID does not exist" containerID="f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429" Apr 16 14:03:06.897422 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.897388 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429"} err="failed to get container status \"f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429\": rpc error: code = NotFound desc = could not find container \"f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429\": container with ID starting with f7c5246ac3af2ebcf5c12aba072096e504967adbdb3a7de0c5568af1f716f429 not found: ID does not exist" Apr 16 14:03:06.897422 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.897406 2574 scope.go:117] "RemoveContainer" containerID="df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0" Apr 16 14:03:06.897628 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:03:06.897606 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0\": container with ID starting with df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0 not found: ID does not exist" 
containerID="df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0" Apr 16 14:03:06.897668 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.897633 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0"} err="failed to get container status \"df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0\": rpc error: code = NotFound desc = could not find container \"df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0\": container with ID starting with df764912f45b8624e229066b3ba4449d1217a350803ebb5e875ba2e78b40e9d0 not found: ID does not exist" Apr 16 14:03:06.914034 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.913978 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:06.914303 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914289 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-thanos" Apr 16 14:03:06.914341 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914307 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-thanos" Apr 16 14:03:06.914341 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914327 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="prometheus" Apr 16 14:03:06.914341 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914336 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="prometheus" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914348 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="thanos-sidecar" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914356 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="thanos-sidecar" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914371 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="init-config-reloader" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914380 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="init-config-reloader" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914389 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-web" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914397 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-web" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914406 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914414 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" 
containerName="kube-rbac-proxy" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914426 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b61adc64-b31c-42dc-b211-b78b9427eac1" containerName="registry" Apr 16 14:03:06.914435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914433 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b61adc64-b31c-42dc-b211-b78b9427eac1" containerName="registry" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914441 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="config-reloader" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914449 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="config-reloader" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914499 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="prometheus" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914512 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-thanos" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914522 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b61adc64-b31c-42dc-b211-b78b9427eac1" containerName="registry" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914533 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="thanos-sidecar" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914542 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="config-reloader" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914552 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy-web" Apr 16 14:03:06.914746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.914562 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" containerName="kube-rbac-proxy" Apr 16 14:03:06.918254 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.918235 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.920696 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.920676 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 14:03:06.920863 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.920843 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-eolrm4d9hr1ql\"" Apr 16 14:03:06.920911 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.920871 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:03:06.921133 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921111 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:03:06.921262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921136 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:03:06.921262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921146 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-mxmxm\"" Apr 16 14:03:06.921262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:03:06.921262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921137 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:03:06.921262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921160 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:03:06.921534 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921519 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:03:06.921769 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921749 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:03:06.921868 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.921754 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:03:06.922101 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.922082 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:03:06.924834 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.924816 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:03:06.928543 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.928525 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:03:06.933797 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.933438 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:06.952339 
ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952311 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbw9v\" (UniqueName: \"kubernetes.io/projected/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-kube-api-access-bbw9v\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952373 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952418 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-web-config\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 
14:03:06.952566 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952618 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-config-out\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.952712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952705 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.953093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952745 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.953093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952769 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.953093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952791 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-config\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.953093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:06.953093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:06.952843 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053202 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053202 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053207 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053429 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053224 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-config\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053429 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053354 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053429 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053558 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053435 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbw9v\" (UniqueName: \"kubernetes.io/projected/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-kube-api-access-bbw9v\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053558 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053558 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053526 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.053710 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.053559 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054191 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054163 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054307 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054239 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054307 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054277 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054419 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054419 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-web-config\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054419 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054419 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054370 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054645 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054417 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054645 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054448 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054645 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-config-out\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.054645 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.054519 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.056646 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.056466 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-config\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.056646 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.056531 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.056832 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.056662 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.056832 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.056761 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.056832 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.056808 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.057207 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.057009 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.057207 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.057058 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.057424 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.057221 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.057515 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.057490 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.057626 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.057606 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.057688 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.057607 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.058658 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.058640 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.059077 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.059055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-web-config\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.059434 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.059415 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-config-out\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.059550 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.059536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.066995 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.066976 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbw9v\" (UniqueName: \"kubernetes.io/projected/fbe94934-f8e3-473a-afe1-8ecf9efd63b3-kube-api-access-bbw9v\") pod \"prometheus-k8s-0\" (UID: \"fbe94934-f8e3-473a-afe1-8ecf9efd63b3\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.186779 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.186710 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90c6d285-34cf-4fa2-ae69-870acdbe1454" path="/var/lib/kubelet/pods/90c6d285-34cf-4fa2-ae69-870acdbe1454/volumes" Apr 16 14:03:07.228303 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.228269 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:03:07.360553 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.360522 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:03:07.361045 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:03:07.361022 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbe94934_f8e3_473a_afe1_8ecf9efd63b3.slice/crio-68af75777d7d690e592205b1f4618e36eaa14f5fb4d04dc2ee763c1d9cb0a72a WatchSource:0}: Error finding container 68af75777d7d690e592205b1f4618e36eaa14f5fb4d04dc2ee763c1d9cb0a72a: Status 404 returned error can't find the container with id 68af75777d7d690e592205b1f4618e36eaa14f5fb4d04dc2ee763c1d9cb0a72a Apr 16 14:03:07.852628 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.852593 2574 generic.go:358] "Generic (PLEG): container finished" podID="fbe94934-f8e3-473a-afe1-8ecf9efd63b3" containerID="316c2124a9750a125237e64cd0a79e33ad24dde23ab58f634bb76f00f958b846" exitCode=0 Apr 16 14:03:07.853059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.852684 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerDied","Data":"316c2124a9750a125237e64cd0a79e33ad24dde23ab58f634bb76f00f958b846"} Apr 16 14:03:07.853059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:07.852730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerStarted","Data":"68af75777d7d690e592205b1f4618e36eaa14f5fb4d04dc2ee763c1d9cb0a72a"} Apr 16 14:03:08.858999 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:08.858960 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerStarted","Data":"5e39f102a1c6109d84bc9d582c6757927366bb8ce5c8a4d1e3456f9fb4bd153d"} Apr 16 14:03:08.858999 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:08.859002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerStarted","Data":"64294894837cf7f763db0cc576cb102e5022caa0eb2c424009af0b2584da9cc6"} Apr 16 14:03:08.859380 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:08.859014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerStarted","Data":"6936a2234c271689757534915a440d62e83d44d722d9895da7b30cc420e9e1e3"} Apr 16 14:03:08.859380 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:08.859023 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerStarted","Data":"e75675798b0907b54fc14589c3fc0090d276de1d3e072fefd3424680f7ab6473"} Apr 16 14:03:08.859380 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:08.859033 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerStarted","Data":"fb7bd3a73e1ef41882d022865f59f644e4252c803266d76f126800902a21934c"} Apr 16 14:03:08.859380 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:08.859041 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"fbe94934-f8e3-473a-afe1-8ecf9efd63b3","Type":"ContainerStarted","Data":"59945c6fcc83c4ffaa5163af7b6b8f10706d846790513848560c03e426618269"} Apr 16 14:03:08.888737 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:08.888679 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.888659057 podStartE2EDuration="2.888659057s" podCreationTimestamp="2026-04-16 14:03:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:03:08.886376793 +0000 UTC m=+230.247919561" watchObservedRunningTime="2026-04-16 14:03:08.888659057 +0000 UTC m=+230.250201823" Apr 16 14:03:12.228829 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:03:12.228787 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:07.229309 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:04:07.229271 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:07.245768 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:04:07.245738 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:08.038352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:04:08.038320 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:04:19.068732 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:04:19.068698 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:04:19.069235 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:04:19.068764 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:07:14.893103 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.893067 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-nmgjf"] Apr 16 14:07:14.895024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.895004 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:14.897094 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.897072 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 14:07:14.897226 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.897144 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:07:14.897780 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.897762 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:07:14.897780 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.897772 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-9pnxd\"" Apr 16 14:07:14.905110 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.905091 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-nmgjf"] Apr 16 14:07:14.980302 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.980267 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2fc61e58-081f-401b-85c4-70fdec24c552-data\") pod \"seaweedfs-86cc847c5c-nmgjf\" (UID: \"2fc61e58-081f-401b-85c4-70fdec24c552\") " pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:14.980487 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:14.980320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbg9m\" (UniqueName: \"kubernetes.io/projected/2fc61e58-081f-401b-85c4-70fdec24c552-kube-api-access-pbg9m\") pod \"seaweedfs-86cc847c5c-nmgjf\" (UID: \"2fc61e58-081f-401b-85c4-70fdec24c552\") " pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:15.080737 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:15.080703 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pbg9m\" (UniqueName: \"kubernetes.io/projected/2fc61e58-081f-401b-85c4-70fdec24c552-kube-api-access-pbg9m\") pod \"seaweedfs-86cc847c5c-nmgjf\" (UID: \"2fc61e58-081f-401b-85c4-70fdec24c552\") " pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:15.080902 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:15.080766 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2fc61e58-081f-401b-85c4-70fdec24c552-data\") pod \"seaweedfs-86cc847c5c-nmgjf\" (UID: \"2fc61e58-081f-401b-85c4-70fdec24c552\") " pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:15.081103 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:15.081087 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2fc61e58-081f-401b-85c4-70fdec24c552-data\") pod \"seaweedfs-86cc847c5c-nmgjf\" (UID: \"2fc61e58-081f-401b-85c4-70fdec24c552\") " pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:15.089018 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:07:15.088992 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbg9m\" (UniqueName: \"kubernetes.io/projected/2fc61e58-081f-401b-85c4-70fdec24c552-kube-api-access-pbg9m\") pod \"seaweedfs-86cc847c5c-nmgjf\" (UID: \"2fc61e58-081f-401b-85c4-70fdec24c552\") " pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:15.204865 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:15.204769 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:15.324060 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:15.323921 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-nmgjf"] Apr 16 14:07:15.326619 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:07:15.326570 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fc61e58_081f_401b_85c4_70fdec24c552.slice/crio-11aa462ecb850f8d57bac1919763422b1e0229c048f41dc70a959365cd8d771d WatchSource:0}: Error finding container 11aa462ecb850f8d57bac1919763422b1e0229c048f41dc70a959365cd8d771d: Status 404 returned error can't find the container with id 11aa462ecb850f8d57bac1919763422b1e0229c048f41dc70a959365cd8d771d Apr 16 14:07:15.327790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:15.327771 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:07:15.506360 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:15.506273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-nmgjf" event={"ID":"2fc61e58-081f-401b-85c4-70fdec24c552","Type":"ContainerStarted","Data":"11aa462ecb850f8d57bac1919763422b1e0229c048f41dc70a959365cd8d771d"} Apr 16 14:07:18.516045 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:18.516008 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-nmgjf" event={"ID":"2fc61e58-081f-401b-85c4-70fdec24c552","Type":"ContainerStarted","Data":"864cc9b896a1af564f72575640983e40511eb70132bbf4d55e97cbb1cfe2e907"} Apr 16 14:07:18.516484 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:18.516178 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:07:18.537850 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:18.537798 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-nmgjf" podStartSLOduration=1.506032706 podStartE2EDuration="4.537783252s" podCreationTimestamp="2026-04-16 14:07:14 +0000 UTC" firstStartedPulling="2026-04-16 14:07:15.327895459 +0000 UTC m=+476.689438205" lastFinishedPulling="2026-04-16 14:07:18.359645995 +0000 UTC m=+479.721188751" observedRunningTime="2026-04-16 14:07:18.536621377 +0000 UTC m=+479.898164139" watchObservedRunningTime="2026-04-16 14:07:18.537783252 +0000 UTC m=+479.899326020" Apr 16 14:07:24.521792 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:07:24.521761 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-nmgjf" Apr 16 14:08:24.757347 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.757312 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-crbmt"] Apr 16 14:08:24.760476 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.760460 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:24.763117 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.763093 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 14:08:24.763343 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.763328 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-ct62r\"" Apr 16 14:08:24.773072 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.773047 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-crbmt"] Apr 16 14:08:24.774393 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.774374 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-md2fm"] Apr 16 14:08:24.777475 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.777455 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:24.779473 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.779453 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 14:08:24.779602 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.779519 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-fshp4\"" Apr 16 14:08:24.783895 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.783873 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-md2fm"] Apr 16 14:08:24.809280 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.809243 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gsz7\" (UniqueName: \"kubernetes.io/projected/fa946f5a-8221-4ddb-818c-316b3ef0afa2-kube-api-access-6gsz7\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:24.809450 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.809321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:24.910196 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.910152 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:24.910367 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.910244 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gsz7\" (UniqueName: \"kubernetes.io/projected/fa946f5a-8221-4ddb-818c-316b3ef0afa2-kube-api-access-6gsz7\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:24.910367 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:08:24.910309 2574 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret 
"model-serving-api-tls" not found Apr 16 14:08:24.910367 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.910292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86603922-7ef5-4db6-bca9-3a12f847824c-cert\") pod \"odh-model-controller-696fc77849-md2fm\" (UID: \"86603922-7ef5-4db6-bca9-3a12f847824c\") " pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:24.910865 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.910424 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf7f9\" (UniqueName: \"kubernetes.io/projected/86603922-7ef5-4db6-bca9-3a12f847824c-kube-api-access-vf7f9\") pod \"odh-model-controller-696fc77849-md2fm\" (UID: \"86603922-7ef5-4db6-bca9-3a12f847824c\") " pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:24.911010 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:08:24.910881 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs podName:fa946f5a-8221-4ddb-818c-316b3ef0afa2 nodeName:}" failed. No retries permitted until 2026-04-16 14:08:25.410851036 +0000 UTC m=+546.772393799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs") pod "model-serving-api-86f7b4b499-crbmt" (UID: "fa946f5a-8221-4ddb-818c-316b3ef0afa2") : secret "model-serving-api-tls" not found Apr 16 14:08:24.919459 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:24.919432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gsz7\" (UniqueName: \"kubernetes.io/projected/fa946f5a-8221-4ddb-818c-316b3ef0afa2-kube-api-access-6gsz7\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:25.011354 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.011266 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86603922-7ef5-4db6-bca9-3a12f847824c-cert\") pod \"odh-model-controller-696fc77849-md2fm\" (UID: \"86603922-7ef5-4db6-bca9-3a12f847824c\") " pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:25.011354 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.011304 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vf7f9\" (UniqueName: \"kubernetes.io/projected/86603922-7ef5-4db6-bca9-3a12f847824c-kube-api-access-vf7f9\") pod \"odh-model-controller-696fc77849-md2fm\" (UID: \"86603922-7ef5-4db6-bca9-3a12f847824c\") " pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:25.013723 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.013697 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/86603922-7ef5-4db6-bca9-3a12f847824c-cert\") pod \"odh-model-controller-696fc77849-md2fm\" (UID: \"86603922-7ef5-4db6-bca9-3a12f847824c\") " pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:25.020528 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.020502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf7f9\" (UniqueName: \"kubernetes.io/projected/86603922-7ef5-4db6-bca9-3a12f847824c-kube-api-access-vf7f9\") pod \"odh-model-controller-696fc77849-md2fm\" 
(UID: \"86603922-7ef5-4db6-bca9-3a12f847824c\") " pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:25.088776 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.088736 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:25.204700 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.204666 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-md2fm"] Apr 16 14:08:25.207496 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:08:25.207466 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86603922_7ef5_4db6_bca9_3a12f847824c.slice/crio-3a748ace322ea25c2462d6a2222fc7a66817c0742892b3a97450aca0eb62b4f5 WatchSource:0}: Error finding container 3a748ace322ea25c2462d6a2222fc7a66817c0742892b3a97450aca0eb62b4f5: Status 404 returned error can't find the container with id 3a748ace322ea25c2462d6a2222fc7a66817c0742892b3a97450aca0eb62b4f5 Apr 16 14:08:25.415368 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.415322 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:25.415616 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:08:25.415480 2574 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 14:08:25.415616 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:08:25.415554 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs podName:fa946f5a-8221-4ddb-818c-316b3ef0afa2 nodeName:}" failed. No retries permitted until 2026-04-16 14:08:26.415533989 +0000 UTC m=+547.777076743 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs") pod "model-serving-api-86f7b4b499-crbmt" (UID: "fa946f5a-8221-4ddb-818c-316b3ef0afa2") : secret "model-serving-api-tls" not found Apr 16 14:08:25.700733 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:25.700642 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-md2fm" event={"ID":"86603922-7ef5-4db6-bca9-3a12f847824c","Type":"ContainerStarted","Data":"3a748ace322ea25c2462d6a2222fc7a66817c0742892b3a97450aca0eb62b4f5"} Apr 16 14:08:26.424215 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:26.424172 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:26.427764 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:26.427732 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fa946f5a-8221-4ddb-818c-316b3ef0afa2-tls-certs\") pod \"model-serving-api-86f7b4b499-crbmt\" (UID: \"fa946f5a-8221-4ddb-818c-316b3ef0afa2\") " pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:26.569984 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:26.569950 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:26.717624 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:26.717525 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-crbmt"] Apr 16 14:08:27.844229 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:08:27.844192 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa946f5a_8221_4ddb_818c_316b3ef0afa2.slice/crio-7a499e033f5dcdf734ae124e45760151b68a34b96c113c02fd01f5be79a3caba WatchSource:0}: Error finding container 7a499e033f5dcdf734ae124e45760151b68a34b96c113c02fd01f5be79a3caba: Status 404 returned error can't find the container with id 7a499e033f5dcdf734ae124e45760151b68a34b96c113c02fd01f5be79a3caba Apr 16 14:08:28.711253 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:28.711201 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-md2fm" event={"ID":"86603922-7ef5-4db6-bca9-3a12f847824c","Type":"ContainerStarted","Data":"fe02d5fde2c2e226b098f769d983083d02fa05912d446b93374a542db98ef8ae"} Apr 16 14:08:28.711442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:28.711303 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:28.712691 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:28.712659 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-crbmt" event={"ID":"fa946f5a-8221-4ddb-818c-316b3ef0afa2","Type":"ContainerStarted","Data":"7a499e033f5dcdf734ae124e45760151b68a34b96c113c02fd01f5be79a3caba"} Apr 16 14:08:28.726899 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:28.726846 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-md2fm" podStartSLOduration=2.0396948520000002 podStartE2EDuration="4.726825415s" 
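The exchange above is a transient startup race rather than a persistent fault: the kubelet tried to mount the tls-certs volume before the secret model-serving-api-tls existed in the kserve namespace, nestedpendingoperations backed off (durationBeforeRetry 500ms at 14:08:24.910881, then 1s at 14:08:25.415554), and MountVolume.SetUp succeeded at 14:08:26.427732 once the secret appeared. A minimal Python sketch for pulling such failures out of an excerpt like this, assuming one journal entry per line; the helper name and regex are illustrative, not part of the kubelet:

    import re

    # Matches kubelet retry entries such as:
    #   ... (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume
    #   "tls-certs" ... : secret "model-serving-api-tls" not found
    RETRY = re.compile(
        r'durationBeforeRetry (?P<delay>[^)]+)\).*?'
        r'MountVolume\.SetUp failed for volume "(?P<volume>[^"]+)".*?'
        r'secret "(?P<secret>[^"]+)" not found'
    )

    def mount_failures(journal_text):
        """Yield (volume, missing secret, retry delay) for each failed SetUp."""
        for line in journal_text.splitlines():
            m = RETRY.search(line)
            if m:
                yield m.group("volume"), m.group("secret"), m.group("delay")

Run over this excerpt it would report the two tls-certs failures (delays 500ms and 1s); a volume that keeps appearing with ever-longer delays, instead of disappearing like this one, is the signature of a secret that is never created.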
podCreationTimestamp="2026-04-16 14:08:24 +0000 UTC" firstStartedPulling="2026-04-16 14:08:25.208657111 +0000 UTC m=+546.570199857" lastFinishedPulling="2026-04-16 14:08:27.895787666 +0000 UTC m=+549.257330420" observedRunningTime="2026-04-16 14:08:28.724967125 +0000 UTC m=+550.086509894" watchObservedRunningTime="2026-04-16 14:08:28.726825415 +0000 UTC m=+550.088368184" Apr 16 14:08:30.723464 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:30.723421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-crbmt" event={"ID":"fa946f5a-8221-4ddb-818c-316b3ef0afa2","Type":"ContainerStarted","Data":"2f76fa5a10229af4b0b024c52a280f2cd079d7e8b48a8e47da57bafd117a43e4"} Apr 16 14:08:30.723902 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:30.723569 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:30.741138 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:30.741080 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-crbmt" podStartSLOduration=4.356101297 podStartE2EDuration="6.741063151s" podCreationTimestamp="2026-04-16 14:08:24 +0000 UTC" firstStartedPulling="2026-04-16 14:08:27.845900906 +0000 UTC m=+549.207443655" lastFinishedPulling="2026-04-16 14:08:30.23086276 +0000 UTC m=+551.592405509" observedRunningTime="2026-04-16 14:08:30.738902091 +0000 UTC m=+552.100444859" watchObservedRunningTime="2026-04-16 14:08:30.741063151 +0000 UTC m=+552.102605921" Apr 16 14:08:39.717902 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:39.717872 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-md2fm" Apr 16 14:08:40.566721 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:40.566685 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-62m6k"] Apr 16 14:08:40.569536 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:40.569518 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-62m6k" Apr 16 14:08:40.576883 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:40.576854 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-62m6k"] Apr 16 14:08:40.743179 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:40.743138 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m56hq\" (UniqueName: \"kubernetes.io/projected/bcfb22cf-a086-4275-a00d-092421ff7fc7-kube-api-access-m56hq\") pod \"s3-init-62m6k\" (UID: \"bcfb22cf-a086-4275-a00d-092421ff7fc7\") " pod="kserve/s3-init-62m6k" Apr 16 14:08:40.843938 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:40.843904 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m56hq\" (UniqueName: \"kubernetes.io/projected/bcfb22cf-a086-4275-a00d-092421ff7fc7-kube-api-access-m56hq\") pod \"s3-init-62m6k\" (UID: \"bcfb22cf-a086-4275-a00d-092421ff7fc7\") " pod="kserve/s3-init-62m6k" Apr 16 14:08:40.852772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:40.852740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m56hq\" (UniqueName: \"kubernetes.io/projected/bcfb22cf-a086-4275-a00d-092421ff7fc7-kube-api-access-m56hq\") pod \"s3-init-62m6k\" (UID: \"bcfb22cf-a086-4275-a00d-092421ff7fc7\") " pod="kserve/s3-init-62m6k" Apr 16 14:08:40.889647 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:40.889598 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-62m6k" Apr 16 14:08:41.005140 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:41.005102 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-62m6k"] Apr 16 14:08:41.008093 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:08:41.008066 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbcfb22cf_a086_4275_a00d_092421ff7fc7.slice/crio-b2841a3a6aa8a79502105db42a42510eadee5ab9eb68746a84a2993b5e1c1660 WatchSource:0}: Error finding container b2841a3a6aa8a79502105db42a42510eadee5ab9eb68746a84a2993b5e1c1660: Status 404 returned error can't find the container with id b2841a3a6aa8a79502105db42a42510eadee5ab9eb68746a84a2993b5e1c1660 Apr 16 14:08:41.732532 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:41.732498 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-crbmt" Apr 16 14:08:41.756318 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:41.756050 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-62m6k" event={"ID":"bcfb22cf-a086-4275-a00d-092421ff7fc7","Type":"ContainerStarted","Data":"b2841a3a6aa8a79502105db42a42510eadee5ab9eb68746a84a2993b5e1c1660"} Apr 16 14:08:45.769848 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:45.769761 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-62m6k" event={"ID":"bcfb22cf-a086-4275-a00d-092421ff7fc7","Type":"ContainerStarted","Data":"94b74acbd203eecc3661c7f42780d32efde6782476054dc777f9ef4791aa9a2d"} Apr 16 14:08:45.783759 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:45.783712 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-62m6k" podStartSLOduration=1.304540431 podStartE2EDuration="5.783696883s" podCreationTimestamp="2026-04-16 14:08:40 +0000 UTC" firstStartedPulling="2026-04-16 14:08:41.009886493 +0000 UTC m=+562.371429239" lastFinishedPulling="2026-04-16 
14:08:45.489042928 +0000 UTC m=+566.850585691" observedRunningTime="2026-04-16 14:08:45.78253404 +0000 UTC m=+567.144076808" watchObservedRunningTime="2026-04-16 14:08:45.783696883 +0000 UTC m=+567.145239650" Apr 16 14:08:48.783380 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:48.783344 2574 generic.go:358] "Generic (PLEG): container finished" podID="bcfb22cf-a086-4275-a00d-092421ff7fc7" containerID="94b74acbd203eecc3661c7f42780d32efde6782476054dc777f9ef4791aa9a2d" exitCode=0 Apr 16 14:08:48.783759 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:48.783422 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-62m6k" event={"ID":"bcfb22cf-a086-4275-a00d-092421ff7fc7","Type":"ContainerDied","Data":"94b74acbd203eecc3661c7f42780d32efde6782476054dc777f9ef4791aa9a2d"} Apr 16 14:08:49.909814 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:49.909792 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-62m6k" Apr 16 14:08:50.019457 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:50.019422 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m56hq\" (UniqueName: \"kubernetes.io/projected/bcfb22cf-a086-4275-a00d-092421ff7fc7-kube-api-access-m56hq\") pod \"bcfb22cf-a086-4275-a00d-092421ff7fc7\" (UID: \"bcfb22cf-a086-4275-a00d-092421ff7fc7\") " Apr 16 14:08:50.021609 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:50.021566 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcfb22cf-a086-4275-a00d-092421ff7fc7-kube-api-access-m56hq" (OuterVolumeSpecName: "kube-api-access-m56hq") pod "bcfb22cf-a086-4275-a00d-092421ff7fc7" (UID: "bcfb22cf-a086-4275-a00d-092421ff7fc7"). InnerVolumeSpecName "kube-api-access-m56hq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:08:50.120921 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:50.120888 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m56hq\" (UniqueName: \"kubernetes.io/projected/bcfb22cf-a086-4275-a00d-092421ff7fc7-kube-api-access-m56hq\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:08:50.788941 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:50.788910 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-62m6k" Apr 16 14:08:50.789110 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:50.788914 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-62m6k" event={"ID":"bcfb22cf-a086-4275-a00d-092421ff7fc7","Type":"ContainerDied","Data":"b2841a3a6aa8a79502105db42a42510eadee5ab9eb68746a84a2993b5e1c1660"} Apr 16 14:08:50.789110 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:50.789022 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b2841a3a6aa8a79502105db42a42510eadee5ab9eb68746a84a2993b5e1c1660" Apr 16 14:08:51.568136 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.568103 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"] Apr 16 14:08:51.568506 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.568371 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bcfb22cf-a086-4275-a00d-092421ff7fc7" containerName="s3-init" Apr 16 14:08:51.568506 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.568381 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcfb22cf-a086-4275-a00d-092421ff7fc7" containerName="s3-init" Apr 16 14:08:51.568506 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.568432 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bcfb22cf-a086-4275-a00d-092421ff7fc7" containerName="s3-init" Apr 16 14:08:51.571670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.571649 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:51.574316 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.574296 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 14:08:51.576501 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.576479 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"] Apr 16 14:08:51.735604 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.735546 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2wjq\" (UniqueName: \"kubernetes.io/projected/9941d594-3213-4492-9366-f7f229f48620-kube-api-access-x2wjq\") pod \"seaweedfs-tls-custom-ddd4dbfd-p2zm8\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:51.735772 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.735654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9941d594-3213-4492-9366-f7f229f48620-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p2zm8\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:51.836420 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.836384 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9941d594-3213-4492-9366-f7f229f48620-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p2zm8\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:51.836636 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.836438 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2wjq\" (UniqueName: 
\"kubernetes.io/projected/9941d594-3213-4492-9366-f7f229f48620-kube-api-access-x2wjq\") pod \"seaweedfs-tls-custom-ddd4dbfd-p2zm8\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:51.836830 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.836810 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9941d594-3213-4492-9366-f7f229f48620-data\") pod \"seaweedfs-tls-custom-ddd4dbfd-p2zm8\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:51.844041 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.844019 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2wjq\" (UniqueName: \"kubernetes.io/projected/9941d594-3213-4492-9366-f7f229f48620-kube-api-access-x2wjq\") pod \"seaweedfs-tls-custom-ddd4dbfd-p2zm8\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") " pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:51.881723 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:51.881687 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" Apr 16 14:08:52.015193 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:52.015042 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"] Apr 16 14:08:52.017797 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:08:52.017770 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9941d594_3213_4492_9366_f7f229f48620.slice/crio-3e0a806e2c12f22c486e410c231334e644f1fcf5ba657b99ee62fe8d5745ea69 WatchSource:0}: Error finding container 3e0a806e2c12f22c486e410c231334e644f1fcf5ba657b99ee62fe8d5745ea69: Status 404 returned error can't find the container with id 3e0a806e2c12f22c486e410c231334e644f1fcf5ba657b99ee62fe8d5745ea69 Apr 16 14:08:52.796124 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:52.796092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" event={"ID":"9941d594-3213-4492-9366-f7f229f48620","Type":"ContainerStarted","Data":"039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f"} Apr 16 14:08:52.796124 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:52.796128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" event={"ID":"9941d594-3213-4492-9366-f7f229f48620","Type":"ContainerStarted","Data":"3e0a806e2c12f22c486e410c231334e644f1fcf5ba657b99ee62fe8d5745ea69"} Apr 16 14:08:52.810835 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:52.810780 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" podStartSLOduration=1.569669294 podStartE2EDuration="1.810760512s" podCreationTimestamp="2026-04-16 14:08:51 +0000 UTC" firstStartedPulling="2026-04-16 14:08:52.019153403 +0000 UTC m=+573.380696148" lastFinishedPulling="2026-04-16 14:08:52.260244621 +0000 UTC m=+573.621787366" observedRunningTime="2026-04-16 14:08:52.809900607 +0000 UTC m=+574.171443377" watchObservedRunningTime="2026-04-16 14:08:52.810760512 +0000 UTC m=+574.172303280" Apr 16 14:08:54.445873 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:54.445841 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"] Apr 16 14:08:54.801891 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:08:54.801801 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" podUID="9941d594-3213-4492-9366-f7f229f48620" containerName="seaweedfs-tls-custom" containerID="cri-o://039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f" gracePeriod=30
Apr 16 14:08:56.046796 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.046770 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"
Apr 16 14:08:56.171030 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.170994 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2wjq\" (UniqueName: \"kubernetes.io/projected/9941d594-3213-4492-9366-f7f229f48620-kube-api-access-x2wjq\") pod \"9941d594-3213-4492-9366-f7f229f48620\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") "
Apr 16 14:08:56.171214 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.171063 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9941d594-3213-4492-9366-f7f229f48620-data\") pod \"9941d594-3213-4492-9366-f7f229f48620\" (UID: \"9941d594-3213-4492-9366-f7f229f48620\") "
Apr 16 14:08:56.172357 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.172336 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9941d594-3213-4492-9366-f7f229f48620-data" (OuterVolumeSpecName: "data") pod "9941d594-3213-4492-9366-f7f229f48620" (UID: "9941d594-3213-4492-9366-f7f229f48620"). InnerVolumeSpecName "data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:08:56.173078 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.173059 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9941d594-3213-4492-9366-f7f229f48620-kube-api-access-x2wjq" (OuterVolumeSpecName: "kube-api-access-x2wjq") pod "9941d594-3213-4492-9366-f7f229f48620" (UID: "9941d594-3213-4492-9366-f7f229f48620"). InnerVolumeSpecName "kube-api-access-x2wjq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:08:56.271622 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.271554 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x2wjq\" (UniqueName: \"kubernetes.io/projected/9941d594-3213-4492-9366-f7f229f48620-kube-api-access-x2wjq\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:08:56.271622 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.271617 2574 reconciler_common.go:299] "Volume detached for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/9941d594-3213-4492-9366-f7f229f48620-data\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:08:56.808162 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.808126 2574 generic.go:358] "Generic (PLEG): container finished" podID="9941d594-3213-4492-9366-f7f229f48620" containerID="039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f" exitCode=0
Apr 16 14:08:56.808330 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.808210 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"
Apr 16 14:08:56.808330 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.808209 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" event={"ID":"9941d594-3213-4492-9366-f7f229f48620","Type":"ContainerDied","Data":"039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f"}
Apr 16 14:08:56.808330 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.808249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8" event={"ID":"9941d594-3213-4492-9366-f7f229f48620","Type":"ContainerDied","Data":"3e0a806e2c12f22c486e410c231334e644f1fcf5ba657b99ee62fe8d5745ea69"}
Apr 16 14:08:56.808330 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.808264 2574 scope.go:117] "RemoveContainer" containerID="039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f"
Apr 16 14:08:56.816950 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.816931 2574 scope.go:117] "RemoveContainer" containerID="039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f"
Apr 16 14:08:56.817226 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:08:56.817198 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f\": container with ID starting with 039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f not found: ID does not exist" containerID="039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f"
Apr 16 14:08:56.817291 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.817234 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f"} err="failed to get container status \"039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f\": rpc error: code = NotFound desc = could not find container \"039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f\": container with ID starting with 039fcae474ca651a17f7e5c84734ae67a13daceec0039e23377c4095b2d1557f not found: ID does not exist"
Apr 16 14:08:56.828059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.828034 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"]
Apr 16 14:08:56.831149 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.831126 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/seaweedfs-tls-custom-ddd4dbfd-p2zm8"]
Apr 16 14:08:56.858014 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.857987 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"]
Apr 16 14:08:56.858288 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.858276 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9941d594-3213-4492-9366-f7f229f48620" containerName="seaweedfs-tls-custom"
Apr 16 14:08:56.858330 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.858290 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9941d594-3213-4492-9366-f7f229f48620" containerName="seaweedfs-tls-custom"
Apr 16 14:08:56.858375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.858366 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9941d594-3213-4492-9366-f7f229f48620" containerName="seaweedfs-tls-custom"
Apr 16 14:08:56.862761 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.862745 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.864609 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.864592 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\""
Apr 16 14:08:56.864702 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.864624 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom\""
Apr 16 14:08:56.867362 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.867340 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"]
Apr 16 14:08:56.877077 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.877048 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/37a7a911-e94b-438e-ad0f-8c3a2988996d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.877184 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.877140 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/37a7a911-e94b-438e-ad0f-8c3a2988996d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.877184 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.877178 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6s89\" (UniqueName: \"kubernetes.io/projected/37a7a911-e94b-438e-ad0f-8c3a2988996d-kube-api-access-l6s89\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.977539 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.977502 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/37a7a911-e94b-438e-ad0f-8c3a2988996d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.977746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.977555 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/37a7a911-e94b-438e-ad0f-8c3a2988996d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.977746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.977696 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l6s89\" (UniqueName: \"kubernetes.io/projected/37a7a911-e94b-438e-ad0f-8c3a2988996d-kube-api-access-l6s89\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.978075 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.978057 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/37a7a911-e94b-438e-ad0f-8c3a2988996d-data\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.980087 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.980063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-custom\" (UniqueName: \"kubernetes.io/projected/37a7a911-e94b-438e-ad0f-8c3a2988996d-seaweedfs-tls-custom\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:56.986735 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:56.986713 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6s89\" (UniqueName: \"kubernetes.io/projected/37a7a911-e94b-438e-ad0f-8c3a2988996d-kube-api-access-l6s89\") pod \"seaweedfs-tls-custom-5c88b85bb7-8vc7n\" (UID: \"37a7a911-e94b-438e-ad0f-8c3a2988996d\") " pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:57.172690 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:57.172651 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"
Apr 16 14:08:57.187393 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:57.187363 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9941d594-3213-4492-9366-f7f229f48620" path="/var/lib/kubelet/pods/9941d594-3213-4492-9366-f7f229f48620/volumes"
Apr 16 14:08:57.292432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:57.292393 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n"]
Apr 16 14:08:57.295348 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:08:57.295315 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a7a911_e94b_438e_ad0f_8c3a2988996d.slice/crio-52a80abfee2af2fe6699dd5ab1a3e4d96dca02449b725d1c5a09d0dfd84a29fd WatchSource:0}: Error finding container 52a80abfee2af2fe6699dd5ab1a3e4d96dca02449b725d1c5a09d0dfd84a29fd: Status 404 returned error can't find the container with id 52a80abfee2af2fe6699dd5ab1a3e4d96dca02449b725d1c5a09d0dfd84a29fd
Apr 16 14:08:57.813896 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:57.813858 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n" event={"ID":"37a7a911-e94b-438e-ad0f-8c3a2988996d","Type":"ContainerStarted","Data":"8ae867b7a05e974266ec8f5e89f67d20abc0572ecfb8cd5cf0dc5cf7a680c7d3"}
Apr 16 14:08:57.813896 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:57.813899 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n" event={"ID":"37a7a911-e94b-438e-ad0f-8c3a2988996d","Type":"ContainerStarted","Data":"52a80abfee2af2fe6699dd5ab1a3e4d96dca02449b725d1c5a09d0dfd84a29fd"}
Apr 16 14:08:57.830592 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:57.830525 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-custom-5c88b85bb7-8vc7n" podStartSLOduration=1.470818392 podStartE2EDuration="1.830508833s" podCreationTimestamp="2026-04-16 14:08:56 +0000 UTC" firstStartedPulling="2026-04-16 14:08:57.296661449 +0000 UTC m=+578.658204196" lastFinishedPulling="2026-04-16 14:08:57.656351876 +0000 UTC m=+579.017894637" observedRunningTime="2026-04-16 14:08:57.828349973 +0000 UTC m=+579.189892741" watchObservedRunningTime="2026-04-16 14:08:57.830508833 +0000 UTC m=+579.192051601"
Apr 16 14:08:58.176951 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.176917 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-2qngm"]
Apr 16 14:08:58.180825 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.180797 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-2qngm"
Apr 16 14:08:58.185289 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.185258 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxpb2\" (UniqueName: \"kubernetes.io/projected/5e0fc195-7fd2-4efd-be4a-4603e88e634d-kube-api-access-xxpb2\") pod \"s3-tls-init-custom-2qngm\" (UID: \"5e0fc195-7fd2-4efd-be4a-4603e88e634d\") " pod="kserve/s3-tls-init-custom-2qngm"
Apr 16 14:08:58.222986 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.222952 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-2qngm"]
Apr 16 14:08:58.286119 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.286080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xxpb2\" (UniqueName: \"kubernetes.io/projected/5e0fc195-7fd2-4efd-be4a-4603e88e634d-kube-api-access-xxpb2\") pod \"s3-tls-init-custom-2qngm\" (UID: \"5e0fc195-7fd2-4efd-be4a-4603e88e634d\") " pod="kserve/s3-tls-init-custom-2qngm"
Apr 16 14:08:58.299560 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.299520 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxpb2\" (UniqueName: \"kubernetes.io/projected/5e0fc195-7fd2-4efd-be4a-4603e88e634d-kube-api-access-xxpb2\") pod \"s3-tls-init-custom-2qngm\" (UID: \"5e0fc195-7fd2-4efd-be4a-4603e88e634d\") " pod="kserve/s3-tls-init-custom-2qngm"
Apr 16 14:08:58.504575 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.504480 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-2qngm"
Apr 16 14:08:58.623901 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.623875 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-2qngm"]
Apr 16 14:08:58.626748 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:08:58.626722 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e0fc195_7fd2_4efd_be4a_4603e88e634d.slice/crio-b8c9b238c72ecc415ffdd2ba9f18ffe7025d604ee0c35457a3086503baed8f85 WatchSource:0}: Error finding container b8c9b238c72ecc415ffdd2ba9f18ffe7025d604ee0c35457a3086503baed8f85: Status 404 returned error can't find the container with id b8c9b238c72ecc415ffdd2ba9f18ffe7025d604ee0c35457a3086503baed8f85
Apr 16 14:08:58.818540 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.818433 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2qngm" event={"ID":"5e0fc195-7fd2-4efd-be4a-4603e88e634d","Type":"ContainerStarted","Data":"b0a691dfdac730d40df514a51575def74b30e97d2fb5e3cc3804c4724196c348"}
Apr 16 14:08:58.818540 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.818478 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2qngm" event={"ID":"5e0fc195-7fd2-4efd-be4a-4603e88e634d","Type":"ContainerStarted","Data":"b8c9b238c72ecc415ffdd2ba9f18ffe7025d604ee0c35457a3086503baed8f85"}
Apr 16 14:08:58.834372 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:08:58.834304 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-2qngm" podStartSLOduration=0.834280538 podStartE2EDuration="834.280538ms" podCreationTimestamp="2026-04-16 14:08:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:08:58.832549226 +0000 UTC m=+580.194091994" watchObservedRunningTime="2026-04-16 14:08:58.834280538 +0000 UTC m=+580.195823348"
Apr 16 14:09:04.842019 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:04.841985 2574 generic.go:358] "Generic (PLEG): container finished" podID="5e0fc195-7fd2-4efd-be4a-4603e88e634d" containerID="b0a691dfdac730d40df514a51575def74b30e97d2fb5e3cc3804c4724196c348" exitCode=0
Apr 16 14:09:04.842428 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:04.842047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2qngm" event={"ID":"5e0fc195-7fd2-4efd-be4a-4603e88e634d","Type":"ContainerDied","Data":"b0a691dfdac730d40df514a51575def74b30e97d2fb5e3cc3804c4724196c348"}
Apr 16 14:09:05.964421 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:05.964400 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-2qngm"
Apr 16 14:09:06.043931 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:06.043888 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxpb2\" (UniqueName: \"kubernetes.io/projected/5e0fc195-7fd2-4efd-be4a-4603e88e634d-kube-api-access-xxpb2\") pod \"5e0fc195-7fd2-4efd-be4a-4603e88e634d\" (UID: \"5e0fc195-7fd2-4efd-be4a-4603e88e634d\") "
Apr 16 14:09:06.045963 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:06.045934 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e0fc195-7fd2-4efd-be4a-4603e88e634d-kube-api-access-xxpb2" (OuterVolumeSpecName: "kube-api-access-xxpb2") pod "5e0fc195-7fd2-4efd-be4a-4603e88e634d" (UID: "5e0fc195-7fd2-4efd-be4a-4603e88e634d"). InnerVolumeSpecName "kube-api-access-xxpb2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:09:06.144691 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:06.144572 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xxpb2\" (UniqueName: \"kubernetes.io/projected/5e0fc195-7fd2-4efd-be4a-4603e88e634d-kube-api-access-xxpb2\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:09:06.847791 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:06.847752 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-2qngm" event={"ID":"5e0fc195-7fd2-4efd-be4a-4603e88e634d","Type":"ContainerDied","Data":"b8c9b238c72ecc415ffdd2ba9f18ffe7025d604ee0c35457a3086503baed8f85"}
Apr 16 14:09:06.847791 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:06.847792 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8c9b238c72ecc415ffdd2ba9f18ffe7025d604ee0c35457a3086503baed8f85"
Apr 16 14:09:06.847991 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:06.847793 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-2qngm"
Apr 16 14:09:07.555636 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.555600 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"]
Apr 16 14:09:07.556024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.555875 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e0fc195-7fd2-4efd-be4a-4603e88e634d" containerName="s3-tls-init-custom"
Apr 16 14:09:07.556024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.555886 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e0fc195-7fd2-4efd-be4a-4603e88e634d" containerName="s3-tls-init-custom"
Apr 16 14:09:07.556024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.555936 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e0fc195-7fd2-4efd-be4a-4603e88e634d" containerName="s3-tls-init-custom"
Apr 16 14:09:07.559079 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.559061 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.561937 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.561911 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving\""
Apr 16 14:09:07.561937 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.561911 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\""
Apr 16 14:09:07.574790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.574764 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"]
Apr 16 14:09:07.659078 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.659036 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09ee73fc-a005-4eb2-833c-86a8deb2d48d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.659078 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.659078 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/09ee73fc-a005-4eb2-833c-86a8deb2d48d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.659292 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.659103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5swdp\" (UniqueName: \"kubernetes.io/projected/09ee73fc-a005-4eb2-833c-86a8deb2d48d-kube-api-access-5swdp\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.759769 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.759739 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09ee73fc-a005-4eb2-833c-86a8deb2d48d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.759940 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.759774 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/09ee73fc-a005-4eb2-833c-86a8deb2d48d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.759940 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.759798 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5swdp\" (UniqueName: \"kubernetes.io/projected/09ee73fc-a005-4eb2-833c-86a8deb2d48d-kube-api-access-5swdp\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.760171 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.760147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/09ee73fc-a005-4eb2-833c-86a8deb2d48d-data\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.762217 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.762189 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"seaweedfs-tls-serving\" (UniqueName: \"kubernetes.io/projected/09ee73fc-a005-4eb2-833c-86a8deb2d48d-seaweedfs-tls-serving\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.768480 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.768457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5swdp\" (UniqueName: \"kubernetes.io/projected/09ee73fc-a005-4eb2-833c-86a8deb2d48d-kube-api-access-5swdp\") pod \"seaweedfs-tls-serving-7fd5766db9-w9jmf\" (UID: \"09ee73fc-a005-4eb2-833c-86a8deb2d48d\") " pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.867920 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.867887 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"
Apr 16 14:09:07.989232 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:07.989189 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf"]
Apr 16 14:09:07.992717 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:09:07.992690 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ee73fc_a005_4eb2_833c_86a8deb2d48d.slice/crio-6f2868598dcc565b906366090867850f9cbffdc4981d5bd377c8c1481800abd8 WatchSource:0}: Error finding container 6f2868598dcc565b906366090867850f9cbffdc4981d5bd377c8c1481800abd8: Status 404 returned error can't find the container with id 6f2868598dcc565b906366090867850f9cbffdc4981d5bd377c8c1481800abd8
Apr 16 14:09:08.853942 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:08.853911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf" event={"ID":"09ee73fc-a005-4eb2-833c-86a8deb2d48d","Type":"ContainerStarted","Data":"87ca40087f645763e27a93b7800e9eaefec4dadf66a58f369bd7fa6648fec288"}
Apr 16 14:09:08.853942 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:08.853943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf" event={"ID":"09ee73fc-a005-4eb2-833c-86a8deb2d48d","Type":"ContainerStarted","Data":"6f2868598dcc565b906366090867850f9cbffdc4981d5bd377c8c1481800abd8"}
Apr 16 14:09:08.869740 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:08.869693 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-tls-serving-7fd5766db9-w9jmf" podStartSLOduration=1.5993445560000001 podStartE2EDuration="1.869677756s" podCreationTimestamp="2026-04-16 14:09:07 +0000 UTC" firstStartedPulling="2026-04-16 14:09:07.99429525 +0000 UTC m=+589.355838000" lastFinishedPulling="2026-04-16 14:09:08.26462845 +0000 UTC m=+589.626171200" observedRunningTime="2026-04-16 14:09:08.867981163 +0000 UTC m=+590.229523933" watchObservedRunningTime="2026-04-16 14:09:08.869677756 +0000 UTC m=+590.231220524"
Apr 16 14:09:09.472573 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.472540 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-799wj"]
Apr 16 14:09:09.476536 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.476519 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-799wj"
Apr 16 14:09:09.484912 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.484884 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-799wj"]
Apr 16 14:09:09.572618 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.572575 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4bjl\" (UniqueName: \"kubernetes.io/projected/6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06-kube-api-access-g4bjl\") pod \"s3-tls-init-serving-799wj\" (UID: \"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06\") " pod="kserve/s3-tls-init-serving-799wj"
Apr 16 14:09:09.673261 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.673226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4bjl\" (UniqueName: \"kubernetes.io/projected/6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06-kube-api-access-g4bjl\") pod \"s3-tls-init-serving-799wj\" (UID: \"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06\") " pod="kserve/s3-tls-init-serving-799wj"
Apr 16 14:09:09.680962 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.680942 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4bjl\" (UniqueName: \"kubernetes.io/projected/6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06-kube-api-access-g4bjl\") pod \"s3-tls-init-serving-799wj\" (UID: \"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06\") " pod="kserve/s3-tls-init-serving-799wj"
Apr 16 14:09:09.799705 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.799606 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-799wj"
Apr 16 14:09:09.920816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:09.920790 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-799wj"]
Apr 16 14:09:09.923172 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:09:09.923140 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6eee4fd4_65cd_4ea6_a004_5bc8ff84ba06.slice/crio-66e6d39c0d9217194214469c234f3979f966b27984bdc86e55dcaa906e89621d WatchSource:0}: Error finding container 66e6d39c0d9217194214469c234f3979f966b27984bdc86e55dcaa906e89621d: Status 404 returned error can't find the container with id 66e6d39c0d9217194214469c234f3979f966b27984bdc86e55dcaa906e89621d
Apr 16 14:09:10.865534 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:10.865502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-799wj" event={"ID":"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06","Type":"ContainerStarted","Data":"2e3d936158330a2b21af6f5ac65485d149e1dca930b9147c461f047ba32a30eb"}
Apr 16 14:09:10.865534 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:10.865535 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-799wj" event={"ID":"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06","Type":"ContainerStarted","Data":"66e6d39c0d9217194214469c234f3979f966b27984bdc86e55dcaa906e89621d"}
Apr 16 14:09:14.878251 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:14.878213 2574 generic.go:358] "Generic (PLEG): container finished" podID="6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06" containerID="2e3d936158330a2b21af6f5ac65485d149e1dca930b9147c461f047ba32a30eb" exitCode=0
Apr 16 14:09:14.878729 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:14.878291 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-799wj" event={"ID":"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06","Type":"ContainerDied","Data":"2e3d936158330a2b21af6f5ac65485d149e1dca930b9147c461f047ba32a30eb"}
Apr 16 14:09:16.004960 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:16.004938 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-799wj"
Apr 16 14:09:16.120597 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:16.120555 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4bjl\" (UniqueName: \"kubernetes.io/projected/6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06-kube-api-access-g4bjl\") pod \"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06\" (UID: \"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06\") "
Apr 16 14:09:16.122652 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:16.122627 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06-kube-api-access-g4bjl" (OuterVolumeSpecName: "kube-api-access-g4bjl") pod "6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06" (UID: "6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06"). InnerVolumeSpecName "kube-api-access-g4bjl". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:09:16.221938 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:16.221859 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g4bjl\" (UniqueName: \"kubernetes.io/projected/6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06-kube-api-access-g4bjl\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:09:16.885311 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:16.885281 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-799wj"
Apr 16 14:09:16.885311 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:16.885290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-799wj" event={"ID":"6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06","Type":"ContainerDied","Data":"66e6d39c0d9217194214469c234f3979f966b27984bdc86e55dcaa906e89621d"}
Apr 16 14:09:16.885311 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:16.885315 2574 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66e6d39c0d9217194214469c234f3979f966b27984bdc86e55dcaa906e89621d"
Apr 16 14:09:19.089057 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:19.089021 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"
Apr 16 14:09:19.089835 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:19.089816 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"
Apr 16 14:09:27.862150 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.862114 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"]
Apr 16 14:09:27.862647 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.862404 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06" containerName="s3-tls-init-serving"
Apr 16 14:09:27.862647 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.862415 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06" containerName="s3-tls-init-serving"
Apr 16 14:09:27.862647 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.862468 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06" containerName="s3-tls-init-serving"
Apr 16 14:09:27.881845 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.881814 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"]
Apr 16 14:09:27.882009 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.881969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:09:27.884145 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.884120 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tdtpg\""
Apr 16 14:09:27.921824 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:27.921794 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/652dae22-4e3c-4697-8389-511dcaaa0f92-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt\" (UID: \"652dae22-4e3c-4697-8389-511dcaaa0f92\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:09:28.022449 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:28.022408 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/652dae22-4e3c-4697-8389-511dcaaa0f92-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt\" (UID: \"652dae22-4e3c-4697-8389-511dcaaa0f92\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:09:28.022786 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:28.022768 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/652dae22-4e3c-4697-8389-511dcaaa0f92-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt\" (UID: \"652dae22-4e3c-4697-8389-511dcaaa0f92\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:09:28.191526 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:28.191439 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:09:28.313003 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:28.312943 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"]
Apr 16 14:09:28.316946 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:09:28.316919 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod652dae22_4e3c_4697_8389_511dcaaa0f92.slice/crio-0b59c3150e09d797acc7304cfd1dc3c4371162d357decceeb4016e89db95edae WatchSource:0}: Error finding container 0b59c3150e09d797acc7304cfd1dc3c4371162d357decceeb4016e89db95edae: Status 404 returned error can't find the container with id 0b59c3150e09d797acc7304cfd1dc3c4371162d357decceeb4016e89db95edae
Apr 16 14:09:28.922996 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:28.922954 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerStarted","Data":"0b59c3150e09d797acc7304cfd1dc3c4371162d357decceeb4016e89db95edae"}
Apr 16 14:09:31.934113 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:31.934075 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerStarted","Data":"71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab"}
Apr 16 14:09:35.950920 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:35.950879 2574 generic.go:358] "Generic (PLEG): container finished" podID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerID="71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab" exitCode=0
Apr 16 14:09:35.951360 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:35.950953 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerDied","Data":"71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab"}
Apr 16 14:09:49.998907 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:49.998869 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerStarted","Data":"52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783"}
Apr 16 14:09:52.006237 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:52.006197 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerStarted","Data":"a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181"}
Apr 16 14:09:52.006651 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:52.006408 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:09:52.006651 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:52.006437 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:09:52.007975 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:52.007938 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:09:52.008592 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:52.008556 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:09:52.025411 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:52.025368 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podStartSLOduration=1.8463767660000001 podStartE2EDuration="25.025334132s" podCreationTimestamp="2026-04-16 14:09:27 +0000 UTC" firstStartedPulling="2026-04-16 14:09:28.318913602 +0000 UTC m=+609.680456348" lastFinishedPulling="2026-04-16 14:09:51.497870965 +0000 UTC m=+632.859413714" observedRunningTime="2026-04-16 14:09:52.024866594 +0000 UTC m=+633.386409362" watchObservedRunningTime="2026-04-16 14:09:52.025334132 +0000 UTC m=+633.386876898"
Apr 16 14:09:53.009598 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:53.009547 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:09:53.010136 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:09:53.009925 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:10:03.009761 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:03.009710 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:10:03.024427 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:03.010186 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:10:13.009977 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:13.009927 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:10:13.010394 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:13.010369 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:10:23.010441 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:23.010391 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:10:23.010984 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:23.010819 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:10:33.010465 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:33.010413 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:10:33.010919 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:33.010887 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:10:43.010280 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:43.010230 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:10:43.010756 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:43.010712 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:10:53.010518 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:53.010467 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:10:53.010947 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:53.010918 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:10:55.187202 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:55.187174 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:10:55.187651 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:10:55.187226 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:11:02.907737 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:02.907700 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"]
Apr 16 14:11:02.908270 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:02.908046 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" containerID="cri-o://52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783" gracePeriod=30
Apr 16 14:11:02.908342 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:02.908296 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" containerID="cri-o://a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181" gracePeriod=30
Apr 16 14:11:03.013266 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.013232 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"]
Apr 16 14:11:03.016821 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.016801 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"
Apr 16 14:11:03.026053 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.026022 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"]
Apr 16 14:11:03.107798 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.107756 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92cbcff-4baf-425a-9b31-c0d2098836d6-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp\" (UID: \"b92cbcff-4baf-425a-9b31-c0d2098836d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"
Apr 16 14:11:03.208730 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.208654 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92cbcff-4baf-425a-9b31-c0d2098836d6-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp\" (UID: \"b92cbcff-4baf-425a-9b31-c0d2098836d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"
Apr 16 14:11:03.209020 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.209000 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92cbcff-4baf-425a-9b31-c0d2098836d6-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp\" (UID: \"b92cbcff-4baf-425a-9b31-c0d2098836d6\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"
Apr 16 14:11:03.328098 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.328060 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"
Apr 16 14:11:03.452876 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:03.452842 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"]
Apr 16 14:11:03.456320 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:11:03.456292 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb92cbcff_4baf_425a_9b31_c0d2098836d6.slice/crio-fa0c46d89e6fbcdc9dccbfe6b0f86131dba69021ff8ef6d75ad9782d7d1eb6c4 WatchSource:0}: Error finding container fa0c46d89e6fbcdc9dccbfe6b0f86131dba69021ff8ef6d75ad9782d7d1eb6c4: Status 404 returned error can't find the container with id fa0c46d89e6fbcdc9dccbfe6b0f86131dba69021ff8ef6d75ad9782d7d1eb6c4
Apr 16 14:11:04.206065 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:04.206027 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerStarted","Data":"7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b"}
Apr 16 14:11:04.206065 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:04.206066 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerStarted","Data":"fa0c46d89e6fbcdc9dccbfe6b0f86131dba69021ff8ef6d75ad9782d7d1eb6c4"}
Apr 16 14:11:05.184080 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:05.184036 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:11:05.184338 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:05.184311 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:08.220018 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:08.219982 2574 generic.go:358] "Generic (PLEG): container finished" podID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerID="52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783" exitCode=0
Apr 16 14:11:08.220463 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:08.220054 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerDied","Data":"52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783"}
Apr 16 14:11:08.221167 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:08.221146 2574 generic.go:358] "Generic (PLEG): container finished" podID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerID="7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b" exitCode=0
Apr 16 14:11:08.221287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:08.221179 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerDied","Data":"7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b"}
Apr 16 14:11:09.225321 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:09.225287 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerStarted","Data":"79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb"}
Apr 16 14:11:09.225321 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:09.225324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerStarted","Data":"e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557"}
Apr 16 14:11:09.225811 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:09.225752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"
Apr 16 14:11:09.225811 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:09.225785 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"
Apr 16 14:11:09.227204 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:09.227177 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:11:09.227873 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:09.227852 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:09.243953 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:09.243876 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podStartSLOduration=7.243861882 podStartE2EDuration="7.243861882s" podCreationTimestamp="2026-04-16 14:11:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:11:09.242157901 +0000 UTC m=+710.603700670" watchObservedRunningTime="2026-04-16 14:11:09.243861882 +0000 UTC m=+710.605404651"
Apr 16 14:11:10.228632 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:10.228598 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:11:10.229025 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:10.228916 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:15.184172 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:15.184124 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:11:15.184616 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:15.184557 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:20.229390 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:20.229338 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:11:20.229988 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:20.229963 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:25.183487 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:25.183437 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.25:8080: connect: connection refused"
Apr 16 14:11:25.183959 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:25.183791 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:25.186696 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:25.186677 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:11:25.186810 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:25.186722 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:11:30.228681 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:30.228639 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:11:30.229150 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:30.229105 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:33.044883 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.044859 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:11:33.145119 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.145085 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/652dae22-4e3c-4697-8389-511dcaaa0f92-kserve-provision-location\") pod \"652dae22-4e3c-4697-8389-511dcaaa0f92\" (UID: \"652dae22-4e3c-4697-8389-511dcaaa0f92\") "
Apr 16 14:11:33.145439 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.145411 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/652dae22-4e3c-4697-8389-511dcaaa0f92-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "652dae22-4e3c-4697-8389-511dcaaa0f92" (UID: "652dae22-4e3c-4697-8389-511dcaaa0f92"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:11:33.245766 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.245733 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/652dae22-4e3c-4697-8389-511dcaaa0f92-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:11:33.295069 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.295028 2574 generic.go:358] "Generic (PLEG): container finished" podID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerID="a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181" exitCode=0
Apr 16 14:11:33.295198 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.295116 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"
Apr 16 14:11:33.295198 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.295111 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerDied","Data":"a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181"}
Apr 16 14:11:33.295198 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.295155 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt" event={"ID":"652dae22-4e3c-4697-8389-511dcaaa0f92","Type":"ContainerDied","Data":"0b59c3150e09d797acc7304cfd1dc3c4371162d357decceeb4016e89db95edae"}
Apr 16 14:11:33.295198 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.295172 2574 scope.go:117] "RemoveContainer" containerID="a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181"
Apr 16 14:11:33.302895 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.302877 2574 scope.go:117] "RemoveContainer" containerID="52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783"
Apr 16 14:11:33.309883 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.309863 2574 scope.go:117] "RemoveContainer" containerID="71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab"
Apr 16 14:11:33.314034 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.314011 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"]
Apr 16 14:11:33.318501 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.318478 2574 scope.go:117] "RemoveContainer" containerID="a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181"
Apr 16 14:11:33.318757 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.318736 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-5cc75f5dfc-w9lgt"]
Apr 16 14:11:33.318816 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:11:33.318766 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181\": container with ID starting with a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181 not found: ID does not exist" containerID="a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181"
Apr 16 14:11:33.318816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.318791 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181"} err="failed to get container status \"a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181\": rpc error: code = NotFound desc = could not find container \"a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181\": container with ID starting with a6762720b1c5b9c87192209a0d68616dc61fbd431ffc9647098b34bfd5253181 not found: ID does not exist"
Apr 16 14:11:33.318816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.318808 2574 scope.go:117] "RemoveContainer" containerID="52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783"
Apr 16 14:11:33.319069 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:11:33.319052 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783\": container with ID starting with 52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783 not found: ID does not exist" containerID="52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783"
Apr 16 14:11:33.319117 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.319073 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783"} err="failed to get container status \"52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783\": rpc error: code = NotFound desc = could not find container \"52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783\": container with ID starting with 52ac39c80398e8e93099b0131b79bf39611bf2fd91812954c11bba5a8b535783 not found: ID does not exist"
Apr 16 14:11:33.319117 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.319087 2574 scope.go:117] "RemoveContainer" containerID="71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab"
Apr 16 14:11:33.319332 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:11:33.319314 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab\": container with ID starting with 71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab not found: ID does not exist" containerID="71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab"
Apr 16 14:11:33.319390 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:33.319340 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab"} err="failed to get container status \"71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab\": rpc error: code = NotFound desc = could not find container \"71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab\": container with ID starting with 71c7a93f3ef6289fbc0345773f8a07dabbeb8164a90fc2171612d23cbec5c3ab not found: ID does not exist"
Apr 16 14:11:35.186856 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:35.186823 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" path="/var/lib/kubelet/pods/652dae22-4e3c-4697-8389-511dcaaa0f92/volumes"
Apr 16 14:11:40.229391 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:40.229338 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:11:40.229968 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:40.229857 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:11:50.228923 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:50.228874 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:11:50.229433 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:11:50.229413 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:12:00.229176 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:00.229127 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:12:00.229724 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:00.229700 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 14:12:10.229139 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:10.229087 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused"
Apr 16 14:12:10.229688 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:10.229493 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed
with statuscode: 503" Apr 16 14:12:20.229798 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:20.229762 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" Apr 16 14:12:20.230226 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:20.229930 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" Apr 16 14:12:28.235513 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.235465 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"] Apr 16 14:12:28.235905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.235801 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" containerID="cri-o://e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557" gracePeriod=30 Apr 16 14:12:28.235959 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.235898 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" containerID="cri-o://79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb" gracePeriod=30 Apr 16 14:12:28.239006 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.238965 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn"] Apr 16 14:12:28.239355 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239339 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="storage-initializer" Apr 16 14:12:28.239355 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239357 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="storage-initializer" Apr 16 14:12:28.239456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239366 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" Apr 16 14:12:28.239456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239371 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" Apr 16 14:12:28.239456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239379 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" Apr 16 14:12:28.239456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239385 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" Apr 16 14:12:28.239456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239431 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="kserve-container" Apr 16 14:12:28.239456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.239441 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="652dae22-4e3c-4697-8389-511dcaaa0f92" containerName="agent" Apr 16 14:12:28.243556 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:12:28.243541 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" Apr 16 14:12:28.249891 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.249866 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn"] Apr 16 14:12:28.253397 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.253380 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" Apr 16 14:12:28.373538 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.373504 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn"] Apr 16 14:12:28.378296 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:12:28.378266 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45b061df_9b40_4b85_8dbe_cca888687430.slice/crio-1ad7a50347b6ea426a6b087fa10fa1aa368ebcff0a8422372248be0147909322 WatchSource:0}: Error finding container 1ad7a50347b6ea426a6b087fa10fa1aa368ebcff0a8422372248be0147909322: Status 404 returned error can't find the container with id 1ad7a50347b6ea426a6b087fa10fa1aa368ebcff0a8422372248be0147909322 Apr 16 14:12:28.380479 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.380462 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:12:28.457269 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:28.457238 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" event={"ID":"45b061df-9b40-4b85-8dbe-cca888687430","Type":"ContainerStarted","Data":"1ad7a50347b6ea426a6b087fa10fa1aa368ebcff0a8422372248be0147909322"} Apr 16 14:12:30.228691 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:30.228651 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused" Apr 16 14:12:30.229100 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:30.228986 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:30.464464 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:30.464423 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" event={"ID":"45b061df-9b40-4b85-8dbe-cca888687430","Type":"ContainerStarted","Data":"4ff0009138361e47fd3d05926fead85f47df95e88d378dfe0e66c7bdef8bcbab"} Apr 16 14:12:30.464682 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:30.464623 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" Apr 16 14:12:30.466045 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:30.466024 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" Apr 16 14:12:30.480735 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:30.480640 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" podStartSLOduration=1.330601615 podStartE2EDuration="2.480628687s" podCreationTimestamp="2026-04-16 14:12:28 +0000 UTC" firstStartedPulling="2026-04-16 14:12:28.380638395 +0000 UTC m=+789.742181142" lastFinishedPulling="2026-04-16 14:12:29.530665464 +0000 UTC m=+790.892208214" observedRunningTime="2026-04-16 14:12:30.479832639 +0000 UTC m=+791.841375407" watchObservedRunningTime="2026-04-16 14:12:30.480628687 +0000 UTC m=+791.842171455" Apr 16 14:12:33.476534 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:33.476501 2574 generic.go:358] "Generic (PLEG): container finished" podID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerID="e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557" exitCode=0 Apr 16 14:12:33.476903 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:33.476574 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerDied","Data":"e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557"} Apr 16 14:12:38.228064 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.227983 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv"] Apr 16 14:12:38.231509 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.231487 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:12:38.241076 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.241046 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv"] Apr 16 14:12:38.273091 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.273052 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b5e73ac-2198-427b-9709-5123d8216858-kserve-provision-location\") pod \"isvc-logger-predictor-5c7f8599d6-7bwlv\" (UID: \"0b5e73ac-2198-427b-9709-5123d8216858\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:12:38.374116 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.374080 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b5e73ac-2198-427b-9709-5123d8216858-kserve-provision-location\") pod \"isvc-logger-predictor-5c7f8599d6-7bwlv\" (UID: \"0b5e73ac-2198-427b-9709-5123d8216858\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:12:38.374481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.374461 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b5e73ac-2198-427b-9709-5123d8216858-kserve-provision-location\") pod \"isvc-logger-predictor-5c7f8599d6-7bwlv\" (UID: \"0b5e73ac-2198-427b-9709-5123d8216858\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:12:38.542152 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.542075 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:12:38.664345 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:38.664319 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv"] Apr 16 14:12:38.666552 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:12:38.666521 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b5e73ac_2198_427b_9709_5123d8216858.slice/crio-ffd76521e9b77fd9229be28b3eccc9818869b2854d6950db751dc96b19f345d4 WatchSource:0}: Error finding container ffd76521e9b77fd9229be28b3eccc9818869b2854d6950db751dc96b19f345d4: Status 404 returned error can't find the container with id ffd76521e9b77fd9229be28b3eccc9818869b2854d6950db751dc96b19f345d4 Apr 16 14:12:39.495529 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:39.495492 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerStarted","Data":"a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696"} Apr 16 14:12:39.495529 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:39.495527 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerStarted","Data":"ffd76521e9b77fd9229be28b3eccc9818869b2854d6950db751dc96b19f345d4"} Apr 16 14:12:40.229397 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:40.229351 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused" Apr 16 14:12:40.229689 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:40.229666 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:42.506296 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:42.506261 2574 generic.go:358] "Generic (PLEG): container finished" podID="0b5e73ac-2198-427b-9709-5123d8216858" containerID="a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696" exitCode=0 Apr 16 14:12:42.506781 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:42.506339 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerDied","Data":"a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696"} Apr 16 14:12:43.511800 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:43.511766 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerStarted","Data":"bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c"} Apr 16 14:12:43.511800 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:43.511806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" 
event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerStarted","Data":"55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157"} Apr 16 14:12:43.512256 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:43.512103 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:12:43.512256 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:43.512138 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:12:43.513624 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:43.513571 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:12:43.514240 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:43.514218 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:43.528287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:43.528242 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podStartSLOduration=5.528229544 podStartE2EDuration="5.528229544s" podCreationTimestamp="2026-04-16 14:12:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:12:43.526604022 +0000 UTC m=+804.888146785" watchObservedRunningTime="2026-04-16 14:12:43.528229544 +0000 UTC m=+804.889772374" Apr 16 14:12:44.515509 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:44.515466 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:12:44.515949 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:44.515905 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:50.229190 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:50.229147 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.26:5000: connect: connection refused" Apr 16 14:12:50.229786 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:50.229281 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" Apr 16 14:12:50.229786 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:50.229522 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:50.229786 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:50.229654 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" Apr 16 14:12:54.516070 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:54.516021 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:12:54.516570 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:54.516548 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:12:58.383249 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.383225 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" Apr 16 14:12:58.555947 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.555860 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92cbcff-4baf-425a-9b31-c0d2098836d6-kserve-provision-location\") pod \"b92cbcff-4baf-425a-9b31-c0d2098836d6\" (UID: \"b92cbcff-4baf-425a-9b31-c0d2098836d6\") " Apr 16 14:12:58.556241 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.556220 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b92cbcff-4baf-425a-9b31-c0d2098836d6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b92cbcff-4baf-425a-9b31-c0d2098836d6" (UID: "b92cbcff-4baf-425a-9b31-c0d2098836d6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:12:58.560207 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.560181 2574 generic.go:358] "Generic (PLEG): container finished" podID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerID="79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb" exitCode=0 Apr 16 14:12:58.560346 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.560253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerDied","Data":"79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb"} Apr 16 14:12:58.560346 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.560290 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" event={"ID":"b92cbcff-4baf-425a-9b31-c0d2098836d6","Type":"ContainerDied","Data":"fa0c46d89e6fbcdc9dccbfe6b0f86131dba69021ff8ef6d75ad9782d7d1eb6c4"} Apr 16 14:12:58.560346 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.560311 2574 scope.go:117] "RemoveContainer" containerID="79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb" Apr 16 14:12:58.560346 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.560316 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp" Apr 16 14:12:58.568150 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.568127 2574 scope.go:117] "RemoveContainer" containerID="e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557" Apr 16 14:12:58.575638 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.575565 2574 scope.go:117] "RemoveContainer" containerID="7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b" Apr 16 14:12:58.580103 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.580082 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"] Apr 16 14:12:58.583381 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.583363 2574 scope.go:117] "RemoveContainer" containerID="79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb" Apr 16 14:12:58.583702 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:12:58.583676 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb\": container with ID starting with 79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb not found: ID does not exist" containerID="79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb" Apr 16 14:12:58.583785 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.583713 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb"} err="failed to get container status \"79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb\": rpc error: code = NotFound desc = could not find container \"79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb\": container with ID starting with 79f0da61e00e1601f2ac7e53575ba948a701d09d50d815c6eb665006ba3359bb not found: ID does not exist" Apr 16 14:12:58.583785 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.583738 2574 scope.go:117] "RemoveContainer" containerID="e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557" Apr 16 14:12:58.584023 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:12:58.583996 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557\": container with ID starting with e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557 not found: ID does not exist" containerID="e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557" Apr 16 14:12:58.584132 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.584025 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557"} err="failed to get container status \"e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557\": rpc error: code = NotFound desc = could not find container \"e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557\": container with ID starting with e49b0769efef9ce9c9f6e3ee0510eb70730dbe3c03319ebbed872e2a90717557 not found: ID does not exist" Apr 16 14:12:58.584132 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.584048 2574 scope.go:117] "RemoveContainer" containerID="7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b" Apr 16 14:12:58.584302 ip-10-0-139-151 
kubenswrapper[2574]: E0416 14:12:58.584282 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b\": container with ID starting with 7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b not found: ID does not exist" containerID="7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b" Apr 16 14:12:58.584343 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.584309 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b"} err="failed to get container status \"7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b\": rpc error: code = NotFound desc = could not find container \"7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b\": container with ID starting with 7e275f91e1aa6d417156f73c30ee3163160c3ab5e48a1a534fbeee0e4230338b not found: ID does not exist" Apr 16 14:12:58.584631 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.584615 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-84d4dd4cc8-m2ksp"] Apr 16 14:12:58.656862 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:58.656822 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b92cbcff-4baf-425a-9b31-c0d2098836d6-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:12:59.186793 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:12:59.186765 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" path="/var/lib/kubelet/pods/b92cbcff-4baf-425a-9b31-c0d2098836d6/volumes" Apr 16 14:13:04.516034 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:04.515989 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:13:04.516537 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:04.516515 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:14.516329 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:14.516276 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:13:14.516843 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:14.516757 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:24.515976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:24.515924 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:13:24.516446 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:24.516323 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:34.516254 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:34.516199 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:13:34.516688 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:34.516610 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:44.515522 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:44.515464 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:13:44.516025 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:44.516001 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:13:48.183776 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:48.183745 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:13:48.184194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:13:48.184101 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:14:03.294174 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.294141 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_message-dumper-predictor-7f66cccfb6-qjbhn_45b061df-9b40-4b85-8dbe-cca888687430/kserve-container/0.log" Apr 16 14:14:03.485131 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.485101 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv"] Apr 16 14:14:03.485428 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.485403 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" containerID="cri-o://55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157" gracePeriod=30 Apr 16 14:14:03.485554 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.485511 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" 
containerID="cri-o://bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c" gracePeriod=30 Apr 16 14:14:03.509748 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.509713 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6"] Apr 16 14:14:03.510038 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510026 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="storage-initializer" Apr 16 14:14:03.510081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510041 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="storage-initializer" Apr 16 14:14:03.510081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510055 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" Apr 16 14:14:03.510081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510064 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" Apr 16 14:14:03.510081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510080 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" Apr 16 14:14:03.510223 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510086 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" Apr 16 14:14:03.510223 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510138 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="agent" Apr 16 14:14:03.510223 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.510150 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b92cbcff-4baf-425a-9b31-c0d2098836d6" containerName="kserve-container" Apr 16 14:14:03.513310 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.513287 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:14:03.522178 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.522154 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6"] Apr 16 14:14:03.618946 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.618915 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn"] Apr 16 14:14:03.619183 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.619158 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" podUID="45b061df-9b40-4b85-8dbe-cca888687430" containerName="kserve-container" containerID="cri-o://4ff0009138361e47fd3d05926fead85f47df95e88d378dfe0e66c7bdef8bcbab" gracePeriod=30 Apr 16 14:14:03.686196 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.686157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9551ed81-a643-4eb8-a3d8-32b0b867d78e-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-x5sm6\" (UID: \"9551ed81-a643-4eb8-a3d8-32b0b867d78e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:14:03.768107 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.768076 2574 generic.go:358] "Generic (PLEG): container finished" podID="45b061df-9b40-4b85-8dbe-cca888687430" containerID="4ff0009138361e47fd3d05926fead85f47df95e88d378dfe0e66c7bdef8bcbab" exitCode=2 Apr 16 14:14:03.768263 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.768142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" event={"ID":"45b061df-9b40-4b85-8dbe-cca888687430","Type":"ContainerDied","Data":"4ff0009138361e47fd3d05926fead85f47df95e88d378dfe0e66c7bdef8bcbab"} Apr 16 14:14:03.787142 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.787106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9551ed81-a643-4eb8-a3d8-32b0b867d78e-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-x5sm6\" (UID: \"9551ed81-a643-4eb8-a3d8-32b0b867d78e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:14:03.787451 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.787432 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9551ed81-a643-4eb8-a3d8-32b0b867d78e-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-x5sm6\" (UID: \"9551ed81-a643-4eb8-a3d8-32b0b867d78e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:14:03.846574 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.846551 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:14:03.864090 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.864071 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" Apr 16 14:14:03.967232 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:03.967200 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6"] Apr 16 14:14:03.971431 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:14:03.971401 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9551ed81_a643_4eb8_a3d8_32b0b867d78e.slice/crio-94658b9dbe99d46fa3c2c122b37f4564f17242dec3e6cf6a0f83b4c9c7450037 WatchSource:0}: Error finding container 94658b9dbe99d46fa3c2c122b37f4564f17242dec3e6cf6a0f83b4c9c7450037: Status 404 returned error can't find the container with id 94658b9dbe99d46fa3c2c122b37f4564f17242dec3e6cf6a0f83b4c9c7450037 Apr 16 14:14:04.771905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:04.771870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" event={"ID":"45b061df-9b40-4b85-8dbe-cca888687430","Type":"ContainerDied","Data":"1ad7a50347b6ea426a6b087fa10fa1aa368ebcff0a8422372248be0147909322"} Apr 16 14:14:04.771905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:04.771901 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn" Apr 16 14:14:04.772413 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:04.771915 2574 scope.go:117] "RemoveContainer" containerID="4ff0009138361e47fd3d05926fead85f47df95e88d378dfe0e66c7bdef8bcbab" Apr 16 14:14:04.773492 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:04.773463 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" event={"ID":"9551ed81-a643-4eb8-a3d8-32b0b867d78e","Type":"ContainerStarted","Data":"0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f"} Apr 16 14:14:04.773632 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:04.773504 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" event={"ID":"9551ed81-a643-4eb8-a3d8-32b0b867d78e","Type":"ContainerStarted","Data":"94658b9dbe99d46fa3c2c122b37f4564f17242dec3e6cf6a0f83b4c9c7450037"} Apr 16 14:14:04.802467 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:04.802402 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn"] Apr 16 14:14:04.804254 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:04.804228 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/message-dumper-predictor-7f66cccfb6-qjbhn"] Apr 16 14:14:05.187561 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:05.187532 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45b061df-9b40-4b85-8dbe-cca888687430" path="/var/lib/kubelet/pods/45b061df-9b40-4b85-8dbe-cca888687430/volumes" Apr 16 14:14:07.786048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:07.786014 2574 generic.go:358] "Generic (PLEG): container finished" podID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerID="0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f" exitCode=0 Apr 16 14:14:07.786394 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:07.786079 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" 
event={"ID":"9551ed81-a643-4eb8-a3d8-32b0b867d78e","Type":"ContainerDied","Data":"0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f"} Apr 16 14:14:08.183519 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:08.183472 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:14:08.183828 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:08.183800 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:08.792906 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:08.792825 2574 generic.go:358] "Generic (PLEG): container finished" podID="0b5e73ac-2198-427b-9709-5123d8216858" containerID="55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157" exitCode=0 Apr 16 14:14:08.792906 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:08.792845 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerDied","Data":"55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157"} Apr 16 14:14:15.818405 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:15.818373 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" event={"ID":"9551ed81-a643-4eb8-a3d8-32b0b867d78e","Type":"ContainerStarted","Data":"a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd"} Apr 16 14:14:15.818873 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:15.818695 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:14:15.819930 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:15.819905 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:14:15.834941 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:15.834885 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podStartSLOduration=5.383881705 podStartE2EDuration="12.834871466s" podCreationTimestamp="2026-04-16 14:14:03 +0000 UTC" firstStartedPulling="2026-04-16 14:14:07.787339238 +0000 UTC m=+889.148881984" lastFinishedPulling="2026-04-16 14:14:15.238328998 +0000 UTC m=+896.599871745" observedRunningTime="2026-04-16 14:14:15.832921126 +0000 UTC m=+897.194463906" watchObservedRunningTime="2026-04-16 14:14:15.834871466 +0000 UTC m=+897.196414234" Apr 16 14:14:16.822055 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:16.822022 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:14:18.184362 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:18.184312 2574 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:14:18.184816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:18.184611 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:19.110994 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:19.110965 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:14:19.112100 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:19.112081 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:14:26.822675 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:26.822630 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:14:28.183502 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:28.183456 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.28:8080: connect: connection refused" Apr 16 14:14:28.183983 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:28.183644 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:14:28.183983 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:28.183804 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:14:28.183983 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:28.183899 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:14:33.666072 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.666047 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:14:33.723989 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.723952 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b5e73ac-2198-427b-9709-5123d8216858-kserve-provision-location\") pod \"0b5e73ac-2198-427b-9709-5123d8216858\" (UID: \"0b5e73ac-2198-427b-9709-5123d8216858\") " Apr 16 14:14:33.724282 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.724257 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b5e73ac-2198-427b-9709-5123d8216858-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0b5e73ac-2198-427b-9709-5123d8216858" (UID: "0b5e73ac-2198-427b-9709-5123d8216858"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:14:33.824939 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.824909 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0b5e73ac-2198-427b-9709-5123d8216858-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:14:33.871460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.871426 2574 generic.go:358] "Generic (PLEG): container finished" podID="0b5e73ac-2198-427b-9709-5123d8216858" containerID="bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c" exitCode=137 Apr 16 14:14:33.871682 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.871512 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" Apr 16 14:14:33.871682 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.871509 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerDied","Data":"bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c"} Apr 16 14:14:33.871682 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.871624 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv" event={"ID":"0b5e73ac-2198-427b-9709-5123d8216858","Type":"ContainerDied","Data":"ffd76521e9b77fd9229be28b3eccc9818869b2854d6950db751dc96b19f345d4"} Apr 16 14:14:33.871682 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.871641 2574 scope.go:117] "RemoveContainer" containerID="bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c" Apr 16 14:14:33.879939 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.879919 2574 scope.go:117] "RemoveContainer" containerID="55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157" Apr 16 14:14:33.887244 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.887225 2574 scope.go:117] "RemoveContainer" containerID="a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696" Apr 16 14:14:33.892092 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.892073 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv"] Apr 16 14:14:33.894665 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.894644 2574 scope.go:117] "RemoveContainer" containerID="bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c" Apr 16 14:14:33.895009 ip-10-0-139-151 
kubenswrapper[2574]: E0416 14:14:33.894985 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c\": container with ID starting with bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c not found: ID does not exist" containerID="bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c" Apr 16 14:14:33.895212 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.895178 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c"} err="failed to get container status \"bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c\": rpc error: code = NotFound desc = could not find container \"bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c\": container with ID starting with bae4bb1ed01fd8abddd163e51d876b9727cfa4a296585e45b83ecc0075a12c4c not found: ID does not exist" Apr 16 14:14:33.895212 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.895213 2574 scope.go:117] "RemoveContainer" containerID="55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157" Apr 16 14:14:33.895533 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:14:33.895514 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157\": container with ID starting with 55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157 not found: ID does not exist" containerID="55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157" Apr 16 14:14:33.895639 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.895538 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157"} err="failed to get container status \"55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157\": rpc error: code = NotFound desc = could not find container \"55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157\": container with ID starting with 55ef32cde1044ad07bf9216358c3fc104044389b87beeaaf3a494b7a04dab157 not found: ID does not exist" Apr 16 14:14:33.895639 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.895556 2574 scope.go:117] "RemoveContainer" containerID="a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696" Apr 16 14:14:33.895915 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:14:33.895895 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696\": container with ID starting with a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696 not found: ID does not exist" containerID="a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696" Apr 16 14:14:33.895985 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.895922 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696"} err="failed to get container status \"a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696\": rpc error: code = NotFound desc = could not find container \"a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696\": container with ID starting with 
a036a7d4031566c72cef6a3753ccaf8c469905cfe2ca13f7e00b662613cd1696 not found: ID does not exist" Apr 16 14:14:33.896550 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:33.896533 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-5c7f8599d6-7bwlv"] Apr 16 14:14:35.191075 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:35.188412 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b5e73ac-2198-427b-9709-5123d8216858" path="/var/lib/kubelet/pods/0b5e73ac-2198-427b-9709-5123d8216858/volumes" Apr 16 14:14:36.822610 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:36.822539 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:14:46.822238 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:46.822193 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:14:56.822302 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:14:56.822257 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:15:06.822626 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:06.822552 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:15:16.822257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:16.822209 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:15:17.184156 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:17.184066 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:15:27.184697 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:27.184649 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:15:37.187383 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:37.187304 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:15:43.687594 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.687552 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6"] Apr 16 14:15:43.688116 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.687833 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" containerID="cri-o://a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd" gracePeriod=30 Apr 16 14:15:43.761990 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.761953 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749"] Apr 16 14:15:43.762279 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762266 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" Apr 16 14:15:43.762323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762281 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" Apr 16 14:15:43.762323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762294 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" Apr 16 14:15:43.762323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762300 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" Apr 16 14:15:43.762323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762308 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="45b061df-9b40-4b85-8dbe-cca888687430" containerName="kserve-container" Apr 16 14:15:43.762323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762313 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="45b061df-9b40-4b85-8dbe-cca888687430" containerName="kserve-container" Apr 16 14:15:43.762474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762329 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="storage-initializer" Apr 16 14:15:43.762474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762337 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="storage-initializer" Apr 16 14:15:43.762474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762389 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="45b061df-9b40-4b85-8dbe-cca888687430" containerName="kserve-container" Apr 16 14:15:43.762474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762397 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="kserve-container" Apr 16 14:15:43.762474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.762403 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0b5e73ac-2198-427b-9709-5123d8216858" containerName="agent" Apr 16 14:15:43.765367 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.765346 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:15:43.773002 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.772977 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749"] Apr 16 14:15:43.888014 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.887976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17484290-2c79-438e-a86c-b62e26bc6644-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-7l749\" (UID: \"17484290-2c79-438e-a86c-b62e26bc6644\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:15:43.989392 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.989298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17484290-2c79-438e-a86c-b62e26bc6644-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-7l749\" (UID: \"17484290-2c79-438e-a86c-b62e26bc6644\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:15:43.989678 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:43.989659 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17484290-2c79-438e-a86c-b62e26bc6644-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-7l749\" (UID: \"17484290-2c79-438e-a86c-b62e26bc6644\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:15:44.076103 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:44.076071 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:15:44.199178 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:44.199144 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749"] Apr 16 14:15:44.203297 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:15:44.203269 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17484290_2c79_438e_a86c_b62e26bc6644.slice/crio-f66d13f09dd8ddbc400e35f425ff3b064bbdc449ed8e75a73e689c2dd90c385b WatchSource:0}: Error finding container f66d13f09dd8ddbc400e35f425ff3b064bbdc449ed8e75a73e689c2dd90c385b: Status 404 returned error can't find the container with id f66d13f09dd8ddbc400e35f425ff3b064bbdc449ed8e75a73e689c2dd90c385b Apr 16 14:15:45.073656 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:45.073620 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" event={"ID":"17484290-2c79-438e-a86c-b62e26bc6644","Type":"ContainerStarted","Data":"5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333"} Apr 16 14:15:45.073656 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:45.073658 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" event={"ID":"17484290-2c79-438e-a86c-b62e26bc6644","Type":"ContainerStarted","Data":"f66d13f09dd8ddbc400e35f425ff3b064bbdc449ed8e75a73e689c2dd90c385b"} Apr 16 14:15:47.184628 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:47.184573 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.29:8080: connect: connection refused" Apr 16 14:15:48.083902 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:48.083867 2574 generic.go:358] "Generic (PLEG): container finished" podID="17484290-2c79-438e-a86c-b62e26bc6644" containerID="5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333" exitCode=0 Apr 16 14:15:48.084080 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:48.083942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" event={"ID":"17484290-2c79-438e-a86c-b62e26bc6644","Type":"ContainerDied","Data":"5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333"} Apr 16 14:15:48.625762 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:48.625739 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:15:48.723104 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:48.723069 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9551ed81-a643-4eb8-a3d8-32b0b867d78e-kserve-provision-location\") pod \"9551ed81-a643-4eb8-a3d8-32b0b867d78e\" (UID: \"9551ed81-a643-4eb8-a3d8-32b0b867d78e\") " Apr 16 14:15:48.723428 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:48.723405 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9551ed81-a643-4eb8-a3d8-32b0b867d78e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9551ed81-a643-4eb8-a3d8-32b0b867d78e" (UID: "9551ed81-a643-4eb8-a3d8-32b0b867d78e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:48.824102 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:48.824077 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9551ed81-a643-4eb8-a3d8-32b0b867d78e-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:15:49.088607 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.088500 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" event={"ID":"17484290-2c79-438e-a86c-b62e26bc6644","Type":"ContainerStarted","Data":"29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84"} Apr 16 14:15:49.088847 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.088813 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:15:49.089933 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.089911 2574 generic.go:358] "Generic (PLEG): container finished" podID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerID="a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd" exitCode=0 Apr 16 14:15:49.090024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.089982 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" Apr 16 14:15:49.090024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.089994 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" event={"ID":"9551ed81-a643-4eb8-a3d8-32b0b867d78e","Type":"ContainerDied","Data":"a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd"} Apr 16 14:15:49.090094 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.090025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6" event={"ID":"9551ed81-a643-4eb8-a3d8-32b0b867d78e","Type":"ContainerDied","Data":"94658b9dbe99d46fa3c2c122b37f4564f17242dec3e6cf6a0f83b4c9c7450037"} Apr 16 14:15:49.090094 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.090041 2574 scope.go:117] "RemoveContainer" containerID="a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd" Apr 16 14:15:49.090486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.090462 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:15:49.097913 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.097571 2574 scope.go:117] "RemoveContainer" containerID="0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f" Apr 16 14:15:49.103446 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.103407 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podStartSLOduration=6.103394529 podStartE2EDuration="6.103394529s" podCreationTimestamp="2026-04-16 14:15:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:15:49.102260494 +0000 UTC m=+990.463803263" watchObservedRunningTime="2026-04-16 14:15:49.103394529 +0000 UTC m=+990.464937298" Apr 16 14:15:49.105024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.105005 2574 scope.go:117] "RemoveContainer" containerID="a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd" Apr 16 14:15:49.105288 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:15:49.105260 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd\": container with ID starting with a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd not found: ID does not exist" containerID="a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd" Apr 16 14:15:49.105373 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.105299 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd"} err="failed to get container status \"a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd\": rpc error: code = NotFound desc = could not find container \"a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd\": container with ID starting with a029ce1a8b254c7c57e760d16a28d0f8474a5f8023646f737a4dadbce166e8fd not found: ID does not exist" Apr 16 14:15:49.105373 ip-10-0-139-151 kubenswrapper[2574]: I0416 
14:15:49.105318 2574 scope.go:117] "RemoveContainer" containerID="0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f" Apr 16 14:15:49.105538 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:15:49.105520 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f\": container with ID starting with 0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f not found: ID does not exist" containerID="0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f" Apr 16 14:15:49.105604 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.105544 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f"} err="failed to get container status \"0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f\": rpc error: code = NotFound desc = could not find container \"0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f\": container with ID starting with 0c63dea97e56e1c89a24cd7d3dee467cb10d850639fa7c95424cd5aa9265de5f not found: ID does not exist" Apr 16 14:15:49.114119 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.114082 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6"] Apr 16 14:15:49.115785 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.115764 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-x5sm6"] Apr 16 14:15:49.187683 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:49.187645 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" path="/var/lib/kubelet/pods/9551ed81-a643-4eb8-a3d8-32b0b867d78e/volumes" Apr 16 14:15:50.094485 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:15:50.094449 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:16:00.094897 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:16:00.094851 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:16:10.094880 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:16:10.094838 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:16:20.094951 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:16:20.094902 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:16:30.095403 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:16:30.095352 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:16:40.095206 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:16:40.095153 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:16:50.095099 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:16:50.095049 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:17:00.094595 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:00.094539 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.30:8080: connect: connection refused" Apr 16 14:17:10.095905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:10.095864 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:17:14.222402 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.222369 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749"] Apr 16 14:17:14.222891 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.222643 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" containerID="cri-o://29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84" gracePeriod=30 Apr 16 14:17:14.308482 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.308442 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m"] Apr 16 14:17:14.308771 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.308758 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="storage-initializer" Apr 16 14:17:14.308821 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.308772 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="storage-initializer" Apr 16 14:17:14.308821 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.308787 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" Apr 16 14:17:14.308821 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.308792 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" containerName="kserve-container" Apr 16 14:17:14.308928 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.308837 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="9551ed81-a643-4eb8-a3d8-32b0b867d78e" 
containerName="kserve-container" Apr 16 14:17:14.311814 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.311796 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:17:14.318903 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.318522 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m"] Apr 16 14:17:14.379720 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.379687 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f91734ea-1d40-4e11-867f-a928d07aba7d-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m\" (UID: \"f91734ea-1d40-4e11-867f-a928d07aba7d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:17:14.481107 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.481007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f91734ea-1d40-4e11-867f-a928d07aba7d-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m\" (UID: \"f91734ea-1d40-4e11-867f-a928d07aba7d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:17:14.481407 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.481388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f91734ea-1d40-4e11-867f-a928d07aba7d-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m\" (UID: \"f91734ea-1d40-4e11-867f-a928d07aba7d\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:17:14.623039 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.623000 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:17:14.742036 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:14.742014 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m"] Apr 16 14:17:14.744563 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:17:14.744532 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf91734ea_1d40_4e11_867f_a928d07aba7d.slice/crio-a35489f89defd7a2d1331ec4d8092a231a06bfceff46e549b05fc8164a0dfe49 WatchSource:0}: Error finding container a35489f89defd7a2d1331ec4d8092a231a06bfceff46e549b05fc8164a0dfe49: Status 404 returned error can't find the container with id a35489f89defd7a2d1331ec4d8092a231a06bfceff46e549b05fc8164a0dfe49 Apr 16 14:17:15.341464 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:15.341427 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" event={"ID":"f91734ea-1d40-4e11-867f-a928d07aba7d","Type":"ContainerStarted","Data":"e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16"} Apr 16 14:17:15.341464 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:15.341466 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" event={"ID":"f91734ea-1d40-4e11-867f-a928d07aba7d","Type":"ContainerStarted","Data":"a35489f89defd7a2d1331ec4d8092a231a06bfceff46e549b05fc8164a0dfe49"} Apr 16 14:17:18.859539 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:17:18.859501 2574 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17484290_2c79_438e_a86c_b62e26bc6644.slice/crio-29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17484290_2c79_438e_a86c_b62e26bc6644.slice/crio-conmon-29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:17:19.066693 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.066670 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:17:19.121406 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.121310 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17484290-2c79-438e-a86c-b62e26bc6644-kserve-provision-location\") pod \"17484290-2c79-438e-a86c-b62e26bc6644\" (UID: \"17484290-2c79-438e-a86c-b62e26bc6644\") " Apr 16 14:17:19.121686 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.121662 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17484290-2c79-438e-a86c-b62e26bc6644-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "17484290-2c79-438e-a86c-b62e26bc6644" (UID: "17484290-2c79-438e-a86c-b62e26bc6644"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:17:19.222075 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.222038 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17484290-2c79-438e-a86c-b62e26bc6644-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:17:19.356189 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.356157 2574 generic.go:358] "Generic (PLEG): container finished" podID="17484290-2c79-438e-a86c-b62e26bc6644" containerID="29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84" exitCode=0 Apr 16 14:17:19.356393 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.356247 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" event={"ID":"17484290-2c79-438e-a86c-b62e26bc6644","Type":"ContainerDied","Data":"29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84"} Apr 16 14:17:19.356393 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.356279 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" Apr 16 14:17:19.356393 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.356308 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749" event={"ID":"17484290-2c79-438e-a86c-b62e26bc6644","Type":"ContainerDied","Data":"f66d13f09dd8ddbc400e35f425ff3b064bbdc449ed8e75a73e689c2dd90c385b"} Apr 16 14:17:19.356393 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.356353 2574 scope.go:117] "RemoveContainer" containerID="29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84" Apr 16 14:17:19.357557 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.357534 2574 generic.go:358] "Generic (PLEG): container finished" podID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerID="e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16" exitCode=0 Apr 16 14:17:19.357691 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.357560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" event={"ID":"f91734ea-1d40-4e11-867f-a928d07aba7d","Type":"ContainerDied","Data":"e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16"} Apr 16 14:17:19.364280 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.364260 2574 scope.go:117] "RemoveContainer" containerID="5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333" Apr 16 14:17:19.371104 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.371086 2574 scope.go:117] "RemoveContainer" containerID="29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84" Apr 16 14:17:19.371321 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:17:19.371305 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84\": container with ID starting with 29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84 not found: ID does not exist" containerID="29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84" Apr 16 14:17:19.371365 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.371330 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84"} err="failed to get container status \"29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84\": rpc error: code = NotFound desc = could not find container \"29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84\": container with ID starting with 29c418231bb1d3dd88ef84e6c4038e27b6a6b99b7ae8037bc4db84c1d2ccea84 not found: ID does not exist" Apr 16 14:17:19.371365 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.371347 2574 scope.go:117] "RemoveContainer" containerID="5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333" Apr 16 14:17:19.371567 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:17:19.371527 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333\": container with ID starting with 5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333 not found: ID does not exist" containerID="5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333" Apr 16 14:17:19.371567 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.371547 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333"} err="failed to get container status \"5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333\": rpc error: code = NotFound desc = could not find container \"5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333\": container with ID starting with 5c4139158759f0cb018a561170ba9c357eeb269b6b99fe384312ed0446375333 not found: ID does not exist" Apr 16 14:17:19.386317 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.386287 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749"] Apr 16 14:17:19.389257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:19.389233 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-7l749"] Apr 16 14:17:21.189599 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:17:21.189547 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17484290-2c79-438e-a86c-b62e26bc6644" path="/var/lib/kubelet/pods/17484290-2c79-438e-a86c-b62e26bc6644/volumes" Apr 16 14:19:42.107498 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:19:42.107464 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:19:42.108023 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:19:42.107991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:19:43.818875 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:19:43.818837 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" event={"ID":"f91734ea-1d40-4e11-867f-a928d07aba7d","Type":"ContainerStarted","Data":"ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb"} Apr 16 14:19:43.819365 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:19:43.818987 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 
16 14:19:43.843860 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:19:43.843795 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" podStartSLOduration=6.406639763 podStartE2EDuration="2m29.843757005s" podCreationTimestamp="2026-04-16 14:17:14 +0000 UTC" firstStartedPulling="2026-04-16 14:17:19.358803767 +0000 UTC m=+1080.720346513" lastFinishedPulling="2026-04-16 14:19:42.795921008 +0000 UTC m=+1224.157463755" observedRunningTime="2026-04-16 14:19:43.843402958 +0000 UTC m=+1225.204945727" watchObservedRunningTime="2026-04-16 14:19:43.843757005 +0000 UTC m=+1225.205299767" Apr 16 14:20:14.827044 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:14.827013 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:20:24.529504 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.529470 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m"] Apr 16 14:20:24.529889 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.529759 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" podUID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerName="kserve-container" containerID="cri-o://ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb" gracePeriod=30 Apr 16 14:20:24.639556 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.639518 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr"] Apr 16 14:20:24.639889 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.639876 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" Apr 16 14:20:24.639932 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.639891 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" Apr 16 14:20:24.639932 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.639911 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="storage-initializer" Apr 16 14:20:24.639932 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.639918 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="storage-initializer" Apr 16 14:20:24.640028 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.639970 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="17484290-2c79-438e-a86c-b62e26bc6644" containerName="kserve-container" Apr 16 14:20:24.642876 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.642860 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:24.652460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.652435 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr"] Apr 16 14:20:24.778090 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.778045 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd9d6fe-f45a-4429-b95b-2d7299217e25-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr\" (UID: \"ecd9d6fe-f45a-4429-b95b-2d7299217e25\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:24.878498 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.878461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd9d6fe-f45a-4429-b95b-2d7299217e25-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr\" (UID: \"ecd9d6fe-f45a-4429-b95b-2d7299217e25\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:24.878849 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.878827 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd9d6fe-f45a-4429-b95b-2d7299217e25-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr\" (UID: \"ecd9d6fe-f45a-4429-b95b-2d7299217e25\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:24.953437 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:24.953403 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:25.072746 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.072711 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr"] Apr 16 14:20:25.076057 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:20:25.076030 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd9d6fe_f45a_4429_b95b_2d7299217e25.slice/crio-7d6c2dc27bd2a731a180b86c0345a2d60282d0043c124f9c3eea141c69919cf1 WatchSource:0}: Error finding container 7d6c2dc27bd2a731a180b86c0345a2d60282d0043c124f9c3eea141c69919cf1: Status 404 returned error can't find the container with id 7d6c2dc27bd2a731a180b86c0345a2d60282d0043c124f9c3eea141c69919cf1 Apr 16 14:20:25.078212 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.078184 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:20:25.661052 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.661030 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:20:25.786354 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.786255 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f91734ea-1d40-4e11-867f-a928d07aba7d-kserve-provision-location\") pod \"f91734ea-1d40-4e11-867f-a928d07aba7d\" (UID: \"f91734ea-1d40-4e11-867f-a928d07aba7d\") " Apr 16 14:20:25.786666 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.786645 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f91734ea-1d40-4e11-867f-a928d07aba7d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f91734ea-1d40-4e11-867f-a928d07aba7d" (UID: "f91734ea-1d40-4e11-867f-a928d07aba7d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:25.887768 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.887728 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f91734ea-1d40-4e11-867f-a928d07aba7d-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:20:25.941300 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.941269 2574 generic.go:358] "Generic (PLEG): container finished" podID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerID="ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb" exitCode=0 Apr 16 14:20:25.941443 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.941338 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" Apr 16 14:20:25.941443 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.941359 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" event={"ID":"f91734ea-1d40-4e11-867f-a928d07aba7d","Type":"ContainerDied","Data":"ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb"} Apr 16 14:20:25.941443 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.941411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m" event={"ID":"f91734ea-1d40-4e11-867f-a928d07aba7d","Type":"ContainerDied","Data":"a35489f89defd7a2d1331ec4d8092a231a06bfceff46e549b05fc8164a0dfe49"} Apr 16 14:20:25.941443 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.941434 2574 scope.go:117] "RemoveContainer" containerID="ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb" Apr 16 14:20:25.942884 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.942856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" event={"ID":"ecd9d6fe-f45a-4429-b95b-2d7299217e25","Type":"ContainerStarted","Data":"b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5"} Apr 16 14:20:25.942884 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.942887 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" event={"ID":"ecd9d6fe-f45a-4429-b95b-2d7299217e25","Type":"ContainerStarted","Data":"7d6c2dc27bd2a731a180b86c0345a2d60282d0043c124f9c3eea141c69919cf1"} Apr 16 14:20:25.949924 ip-10-0-139-151 kubenswrapper[2574]: 
I0416 14:20:25.949900 2574 scope.go:117] "RemoveContainer" containerID="e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16" Apr 16 14:20:25.957035 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.957020 2574 scope.go:117] "RemoveContainer" containerID="ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb" Apr 16 14:20:25.957263 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:20:25.957243 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb\": container with ID starting with ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb not found: ID does not exist" containerID="ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb" Apr 16 14:20:25.957307 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.957270 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb"} err="failed to get container status \"ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb\": rpc error: code = NotFound desc = could not find container \"ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb\": container with ID starting with ce6e645e3c47de174b577eea56f43ef5a376e05218912578d797b3cf5b7fd3bb not found: ID does not exist" Apr 16 14:20:25.957307 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.957285 2574 scope.go:117] "RemoveContainer" containerID="e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16" Apr 16 14:20:25.957466 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:20:25.957451 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16\": container with ID starting with e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16 not found: ID does not exist" containerID="e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16" Apr 16 14:20:25.957504 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.957470 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16"} err="failed to get container status \"e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16\": rpc error: code = NotFound desc = could not find container \"e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16\": container with ID starting with e8a18ff193dc36b3d07fa1ee3cf46b089793d2b86f8d8415fbd0960ec2bbbd16 not found: ID does not exist" Apr 16 14:20:25.992248 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.992218 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m"] Apr 16 14:20:25.996108 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:25.996088 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-jvm4m"] Apr 16 14:20:27.187192 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:27.187160 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f91734ea-1d40-4e11-867f-a928d07aba7d" path="/var/lib/kubelet/pods/f91734ea-1d40-4e11-867f-a928d07aba7d/volumes" Apr 16 14:20:28.953743 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:28.953658 2574 generic.go:358] "Generic (PLEG): container 
finished" podID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerID="b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5" exitCode=0 Apr 16 14:20:28.953743 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:28.953729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" event={"ID":"ecd9d6fe-f45a-4429-b95b-2d7299217e25","Type":"ContainerDied","Data":"b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5"} Apr 16 14:20:29.957962 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:29.957929 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" event={"ID":"ecd9d6fe-f45a-4429-b95b-2d7299217e25","Type":"ContainerStarted","Data":"2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087"} Apr 16 14:20:29.958414 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:29.958228 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:29.959663 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:29.959633 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 14:20:29.973657 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:29.973608 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" podStartSLOduration=5.973592687 podStartE2EDuration="5.973592687s" podCreationTimestamp="2026-04-16 14:20:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:29.97172227 +0000 UTC m=+1271.333265038" watchObservedRunningTime="2026-04-16 14:20:29.973592687 +0000 UTC m=+1271.335135445" Apr 16 14:20:30.961654 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:30.961609 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.32:8080: connect: connection refused" Apr 16 14:20:40.962770 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:40.962734 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:44.670639 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.670608 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr"] Apr 16 14:20:44.671003 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.670829 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="kserve-container" containerID="cri-o://2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087" gracePeriod=30 Apr 16 14:20:44.712804 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.712772 2574 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx"] Apr 16 14:20:44.713110 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.713097 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerName="storage-initializer" Apr 16 14:20:44.713164 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.713112 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerName="storage-initializer" Apr 16 14:20:44.713164 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.713130 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerName="kserve-container" Apr 16 14:20:44.713164 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.713138 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerName="kserve-container" Apr 16 14:20:44.713256 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.713214 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f91734ea-1d40-4e11-867f-a928d07aba7d" containerName="kserve-container" Apr 16 14:20:44.716075 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.716056 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:20:44.725247 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.725224 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx"] Apr 16 14:20:44.851089 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.851049 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e5fead1-dbf5-4ba1-996e-4e949138b5ee-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx\" (UID: \"0e5fead1-dbf5-4ba1-996e-4e949138b5ee\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:20:44.952465 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.952363 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e5fead1-dbf5-4ba1-996e-4e949138b5ee-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx\" (UID: \"0e5fead1-dbf5-4ba1-996e-4e949138b5ee\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:20:44.952806 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:44.952780 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e5fead1-dbf5-4ba1-996e-4e949138b5ee-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx\" (UID: \"0e5fead1-dbf5-4ba1-996e-4e949138b5ee\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:20:45.027707 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:45.027660 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:20:45.159922 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:45.159887 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx"] Apr 16 14:20:45.164010 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:20:45.163981 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e5fead1_dbf5_4ba1_996e_4e949138b5ee.slice/crio-426942e88776b761ea218dde40d259f860d229079ff3f81f0330a5cfdf8f6e32 WatchSource:0}: Error finding container 426942e88776b761ea218dde40d259f860d229079ff3f81f0330a5cfdf8f6e32: Status 404 returned error can't find the container with id 426942e88776b761ea218dde40d259f860d229079ff3f81f0330a5cfdf8f6e32 Apr 16 14:20:45.408961 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:45.408935 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:45.556159 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:45.556074 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd9d6fe-f45a-4429-b95b-2d7299217e25-kserve-provision-location\") pod \"ecd9d6fe-f45a-4429-b95b-2d7299217e25\" (UID: \"ecd9d6fe-f45a-4429-b95b-2d7299217e25\") " Apr 16 14:20:45.556467 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:45.556445 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ecd9d6fe-f45a-4429-b95b-2d7299217e25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ecd9d6fe-f45a-4429-b95b-2d7299217e25" (UID: "ecd9d6fe-f45a-4429-b95b-2d7299217e25"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:45.657534 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:45.657498 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ecd9d6fe-f45a-4429-b95b-2d7299217e25-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:20:46.005165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.005114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" event={"ID":"0e5fead1-dbf5-4ba1-996e-4e949138b5ee","Type":"ContainerStarted","Data":"483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044"} Apr 16 14:20:46.005165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.005169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" event={"ID":"0e5fead1-dbf5-4ba1-996e-4e949138b5ee","Type":"ContainerStarted","Data":"426942e88776b761ea218dde40d259f860d229079ff3f81f0330a5cfdf8f6e32"} Apr 16 14:20:46.006454 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.006430 2574 generic.go:358] "Generic (PLEG): container finished" podID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerID="2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087" exitCode=0 Apr 16 14:20:46.006536 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.006480 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" event={"ID":"ecd9d6fe-f45a-4429-b95b-2d7299217e25","Type":"ContainerDied","Data":"2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087"} Apr 16 14:20:46.006536 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.006504 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" event={"ID":"ecd9d6fe-f45a-4429-b95b-2d7299217e25","Type":"ContainerDied","Data":"7d6c2dc27bd2a731a180b86c0345a2d60282d0043c124f9c3eea141c69919cf1"} Apr 16 14:20:46.006536 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.006504 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr" Apr 16 14:20:46.006536 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.006516 2574 scope.go:117] "RemoveContainer" containerID="2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087" Apr 16 14:20:46.014607 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.014566 2574 scope.go:117] "RemoveContainer" containerID="b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5" Apr 16 14:20:46.024571 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.024544 2574 scope.go:117] "RemoveContainer" containerID="2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087" Apr 16 14:20:46.025045 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:20:46.025021 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087\": container with ID starting with 2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087 not found: ID does not exist" containerID="2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087" Apr 16 14:20:46.025125 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.025058 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087"} err="failed to get container status \"2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087\": rpc error: code = NotFound desc = could not find container \"2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087\": container with ID starting with 2d42fbcf5b542cedd0690f630d6bdf352bc505d079735fbf29b190135d1be087 not found: ID does not exist" Apr 16 14:20:46.025125 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.025082 2574 scope.go:117] "RemoveContainer" containerID="b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5" Apr 16 14:20:46.025363 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:20:46.025347 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5\": container with ID starting with b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5 not found: ID does not exist" containerID="b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5" Apr 16 14:20:46.025405 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.025370 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5"} err="failed to get container status \"b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5\": rpc error: code = NotFound desc = could not find container \"b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5\": container with ID starting with b854e216d2251760397a385249a19b895a728ba9ac803e9ea1a186bf137940c5 not found: ID does not exist" Apr 16 14:20:46.034522 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.034488 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr"] Apr 16 14:20:46.038219 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:46.038197 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-m7shr"] Apr 16 14:20:47.187713 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:20:47.187683 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" path="/var/lib/kubelet/pods/ecd9d6fe-f45a-4429-b95b-2d7299217e25/volumes" Apr 16 14:20:50.021701 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:50.021662 2574 generic.go:358] "Generic (PLEG): container finished" podID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerID="483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044" exitCode=0 Apr 16 14:20:50.022190 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:50.021706 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" event={"ID":"0e5fead1-dbf5-4ba1-996e-4e949138b5ee","Type":"ContainerDied","Data":"483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044"} Apr 16 14:20:51.026693 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:51.026654 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" event={"ID":"0e5fead1-dbf5-4ba1-996e-4e949138b5ee","Type":"ContainerStarted","Data":"bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840"} Apr 16 14:20:51.027078 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:51.026871 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:20:51.042570 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:20:51.042524 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" podStartSLOduration=7.042508531 podStartE2EDuration="7.042508531s" podCreationTimestamp="2026-04-16 14:20:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:51.041328004 +0000 UTC m=+1292.402870774" watchObservedRunningTime="2026-04-16 14:20:51.042508531 +0000 UTC m=+1292.404051300" Apr 16 14:21:22.034666 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:22.034634 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:21:24.899636 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.899598 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq"] Apr 16 14:21:24.900061 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.899892 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="kserve-container" Apr 16 14:21:24.900061 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.899902 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="kserve-container" Apr 16 14:21:24.900061 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.899926 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="storage-initializer" Apr 16 14:21:24.900061 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.899932 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="storage-initializer" Apr 16 14:21:24.900061 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.899975 2574 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="ecd9d6fe-f45a-4429-b95b-2d7299217e25" containerName="kserve-container" Apr 16 14:21:24.904014 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.903991 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:21:24.911111 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.910789 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq"] Apr 16 14:21:24.948163 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.948125 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx"] Apr 16 14:21:24.948458 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.948434 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" podUID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerName="kserve-container" containerID="cri-o://bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840" gracePeriod=30 Apr 16 14:21:24.973897 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:24.973862 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-57b5b74956-2hnfq\" (UID: \"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:21:25.074415 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:25.074367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-57b5b74956-2hnfq\" (UID: \"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:21:25.074788 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:25.074766 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-57b5b74956-2hnfq\" (UID: \"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:21:25.214916 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:25.214820 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:21:25.369856 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:25.369829 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq"] Apr 16 14:21:25.372490 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:21:25.372464 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddafaa76d_5cfd_4c42_8c6d_bb60c47bb7a4.slice/crio-6f9c6e8f111495c6a267a5f790ee76b401a1387b2067f146970957db50d2f383 WatchSource:0}: Error finding container 6f9c6e8f111495c6a267a5f790ee76b401a1387b2067f146970957db50d2f383: Status 404 returned error can't find the container with id 6f9c6e8f111495c6a267a5f790ee76b401a1387b2067f146970957db50d2f383 Apr 16 14:21:26.129318 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:26.129284 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerStarted","Data":"95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b"} Apr 16 14:21:26.129318 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:26.129320 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerStarted","Data":"6f9c6e8f111495c6a267a5f790ee76b401a1387b2067f146970957db50d2f383"} Apr 16 14:21:26.290017 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:26.289990 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:21:26.385997 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:26.385900 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e5fead1-dbf5-4ba1-996e-4e949138b5ee-kserve-provision-location\") pod \"0e5fead1-dbf5-4ba1-996e-4e949138b5ee\" (UID: \"0e5fead1-dbf5-4ba1-996e-4e949138b5ee\") " Apr 16 14:21:26.386273 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:26.386249 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e5fead1-dbf5-4ba1-996e-4e949138b5ee-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0e5fead1-dbf5-4ba1-996e-4e949138b5ee" (UID: "0e5fead1-dbf5-4ba1-996e-4e949138b5ee"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:21:26.486647 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:26.486612 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0e5fead1-dbf5-4ba1-996e-4e949138b5ee-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:21:27.134055 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.134023 2574 generic.go:358] "Generic (PLEG): container finished" podID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerID="bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840" exitCode=0 Apr 16 14:21:27.134481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.134092 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" Apr 16 14:21:27.134481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.134102 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" event={"ID":"0e5fead1-dbf5-4ba1-996e-4e949138b5ee","Type":"ContainerDied","Data":"bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840"} Apr 16 14:21:27.134481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.134135 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx" event={"ID":"0e5fead1-dbf5-4ba1-996e-4e949138b5ee","Type":"ContainerDied","Data":"426942e88776b761ea218dde40d259f860d229079ff3f81f0330a5cfdf8f6e32"} Apr 16 14:21:27.134481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.134153 2574 scope.go:117] "RemoveContainer" containerID="bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840" Apr 16 14:21:27.142317 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.142293 2574 scope.go:117] "RemoveContainer" containerID="483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044" Apr 16 14:21:27.149146 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.149128 2574 scope.go:117] "RemoveContainer" containerID="bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840" Apr 16 14:21:27.149397 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:21:27.149378 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840\": container with ID starting with bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840 not found: ID does not exist" containerID="bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840" Apr 16 14:21:27.149450 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.149403 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840"} err="failed to get container status \"bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840\": rpc error: code = NotFound desc = could not find container \"bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840\": container with ID starting with bdf67ad3ec6f89e398c65989df48e724577284b0c834ef517e0cce937897e840 not found: ID does not exist" Apr 16 14:21:27.149450 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.149419 2574 scope.go:117] "RemoveContainer" containerID="483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044" Apr 16 14:21:27.149617 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:21:27.149599 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044\": container with ID starting with 483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044 not found: ID does not exist" containerID="483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044" Apr 16 14:21:27.149708 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.149617 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044"} err="failed to get container status \"483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044\": rpc 
error: code = NotFound desc = could not find container \"483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044\": container with ID starting with 483f8d00ba79aca3f64347def082be8491cb9a4cacd14ea8d899f42c79658044 not found: ID does not exist" Apr 16 14:21:27.155038 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.155014 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx"] Apr 16 14:21:27.159850 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.159826 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-vtcxx"] Apr 16 14:21:27.186862 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:27.186835 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" path="/var/lib/kubelet/pods/0e5fead1-dbf5-4ba1-996e-4e949138b5ee/volumes" Apr 16 14:21:30.144992 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:30.144959 2574 generic.go:358] "Generic (PLEG): container finished" podID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerID="95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b" exitCode=0 Apr 16 14:21:30.145378 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:30.145029 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerDied","Data":"95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b"} Apr 16 14:21:31.150629 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:31.150558 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerStarted","Data":"f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e"} Apr 16 14:21:34.161244 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:34.161211 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerStarted","Data":"c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8"} Apr 16 14:21:34.161677 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:34.161463 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:21:34.179267 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:34.179215 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" podStartSLOduration=6.869664997 podStartE2EDuration="10.179201044s" podCreationTimestamp="2026-04-16 14:21:24 +0000 UTC" firstStartedPulling="2026-04-16 14:21:30.210288309 +0000 UTC m=+1331.571831055" lastFinishedPulling="2026-04-16 14:21:33.519824342 +0000 UTC m=+1334.881367102" observedRunningTime="2026-04-16 14:21:34.177095821 +0000 UTC m=+1335.538638589" watchObservedRunningTime="2026-04-16 14:21:34.179201044 +0000 UTC m=+1335.540743811" Apr 16 14:21:35.164938 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:21:35.164907 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:22:06.169918 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:06.169831 2574 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:22:36.171372 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:36.171342 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:22:45.023829 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.023789 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq"] Apr 16 14:22:45.024234 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.024077 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-container" containerID="cri-o://f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e" gracePeriod=30 Apr 16 14:22:45.024234 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.024131 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-agent" containerID="cri-o://c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8" gracePeriod=30 Apr 16 14:22:45.084563 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.084530 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw"] Apr 16 14:22:45.084846 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.084833 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerName="kserve-container" Apr 16 14:22:45.084888 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.084847 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerName="kserve-container" Apr 16 14:22:45.084888 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.084857 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerName="storage-initializer" Apr 16 14:22:45.084888 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.084863 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerName="storage-initializer" Apr 16 14:22:45.084983 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.084935 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0e5fead1-dbf5-4ba1-996e-4e949138b5ee" containerName="kserve-container" Apr 16 14:22:45.089209 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.089174 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:22:45.094357 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.094328 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw"] Apr 16 14:22:45.229842 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.229804 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-frsnw\" (UID: \"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:22:45.331133 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.331104 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-frsnw\" (UID: \"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:22:45.331444 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.331427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-frsnw\" (UID: \"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:22:45.400310 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.400273 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:22:45.524299 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:45.524257 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw"] Apr 16 14:22:45.526483 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:22:45.526454 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f7f1f1d_b6a4_45b6_8b2c_0c532d657d12.slice/crio-a556fa0f334024fe07a92adff4e9e23f34007755a54182fe51bbfca08c824985 WatchSource:0}: Error finding container a556fa0f334024fe07a92adff4e9e23f34007755a54182fe51bbfca08c824985: Status 404 returned error can't find the container with id a556fa0f334024fe07a92adff4e9e23f34007755a54182fe51bbfca08c824985 Apr 16 14:22:46.168556 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:46.168513 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 14:22:46.369277 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:46.369239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" event={"ID":"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12","Type":"ContainerStarted","Data":"ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e"} Apr 16 14:22:46.369277 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:46.369276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" event={"ID":"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12","Type":"ContainerStarted","Data":"a556fa0f334024fe07a92adff4e9e23f34007755a54182fe51bbfca08c824985"} Apr 16 14:22:47.374895 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:47.374859 2574 generic.go:358] "Generic (PLEG): container finished" podID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerID="f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e" exitCode=0 Apr 16 14:22:47.375287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:47.374934 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerDied","Data":"f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e"} Apr 16 14:22:50.384912 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:50.384880 2574 generic.go:358] "Generic (PLEG): container finished" podID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerID="ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e" exitCode=0 Apr 16 14:22:50.385373 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:50.384955 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" event={"ID":"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12","Type":"ContainerDied","Data":"ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e"} Apr 16 14:22:56.169221 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:22:56.169163 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-container" probeResult="failure" 
output="Get \"http://10.133.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 14:23:02.429262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:02.429228 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" event={"ID":"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12","Type":"ContainerStarted","Data":"2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979"} Apr 16 14:23:02.429695 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:02.429507 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:23:02.430904 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:02.430879 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 14:23:03.433392 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:03.433356 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 14:23:06.168973 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:06.168931 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.34:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.133.0.34:8080: connect: connection refused" Apr 16 14:23:06.169425 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:06.169076 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:23:06.186789 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:06.186743 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podStartSLOduration=9.621358971 podStartE2EDuration="21.18672838s" podCreationTimestamp="2026-04-16 14:22:45 +0000 UTC" firstStartedPulling="2026-04-16 14:22:50.386153099 +0000 UTC m=+1411.747695846" lastFinishedPulling="2026-04-16 14:23:01.951522503 +0000 UTC m=+1423.313065255" observedRunningTime="2026-04-16 14:23:02.448496814 +0000 UTC m=+1423.810039592" watchObservedRunningTime="2026-04-16 14:23:06.18672838 +0000 UTC m=+1427.548271149" Apr 16 14:23:13.434294 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:13.434251 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 14:23:15.169153 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.169126 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:23:15.285153 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.285053 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4-kserve-provision-location\") pod \"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4\" (UID: \"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4\") " Apr 16 14:23:15.285415 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.285390 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" (UID: "dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:23:15.385936 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.385898 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:23:15.470113 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.470081 2574 generic.go:358] "Generic (PLEG): container finished" podID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerID="c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8" exitCode=137 Apr 16 14:23:15.470283 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.470152 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" Apr 16 14:23:15.470283 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.470157 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerDied","Data":"c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8"} Apr 16 14:23:15.470283 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.470198 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq" event={"ID":"dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4","Type":"ContainerDied","Data":"6f9c6e8f111495c6a267a5f790ee76b401a1387b2067f146970957db50d2f383"} Apr 16 14:23:15.470283 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.470213 2574 scope.go:117] "RemoveContainer" containerID="c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8" Apr 16 14:23:15.478280 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.478256 2574 scope.go:117] "RemoveContainer" containerID="f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e" Apr 16 14:23:15.489560 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.486774 2574 scope.go:117] "RemoveContainer" containerID="95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b" Apr 16 14:23:15.495055 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.495030 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq"] Apr 16 14:23:15.497320 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.497291 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-57b5b74956-2hnfq"] Apr 16 
14:23:15.497745 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.497732 2574 scope.go:117] "RemoveContainer" containerID="c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8" Apr 16 14:23:15.498055 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:23:15.498033 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8\": container with ID starting with c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8 not found: ID does not exist" containerID="c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8" Apr 16 14:23:15.498143 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.498062 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8"} err="failed to get container status \"c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8\": rpc error: code = NotFound desc = could not find container \"c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8\": container with ID starting with c4299bf40cf5ce7c70468857cd30195273b2234fed8570788c131da2f868ebd8 not found: ID does not exist" Apr 16 14:23:15.498143 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.498082 2574 scope.go:117] "RemoveContainer" containerID="f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e" Apr 16 14:23:15.498317 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:23:15.498300 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e\": container with ID starting with f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e not found: ID does not exist" containerID="f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e" Apr 16 14:23:15.498354 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.498322 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e"} err="failed to get container status \"f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e\": rpc error: code = NotFound desc = could not find container \"f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e\": container with ID starting with f62fa2ce2a597753f9ecbaa4ff81997ea364efe73ba74f709eb3c34377c1842e not found: ID does not exist" Apr 16 14:23:15.498354 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.498334 2574 scope.go:117] "RemoveContainer" containerID="95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b" Apr 16 14:23:15.498520 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:23:15.498502 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b\": container with ID starting with 95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b not found: ID does not exist" containerID="95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b" Apr 16 14:23:15.498562 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:15.498523 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b"} err="failed to get container status 
\"95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b\": rpc error: code = NotFound desc = could not find container \"95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b\": container with ID starting with 95f4c4fccde911c1893454cf68172e2d53eab4d2f9270d2977ea86cf9c49570b not found: ID does not exist" Apr 16 14:23:17.187901 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:17.187870 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" path="/var/lib/kubelet/pods/dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4/volumes" Apr 16 14:23:23.433704 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:23.433662 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 14:23:33.433537 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:33.433490 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 14:23:43.433933 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:43.433888 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.35:8080: connect: connection refused" Apr 16 14:23:53.434967 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:53.434933 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:23:56.662449 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.662412 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw"] Apr 16 14:23:56.662962 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.662686 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" containerID="cri-o://2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979" gracePeriod=30 Apr 16 14:23:56.744568 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744533 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6"] Apr 16 14:23:56.744880 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744866 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-agent" Apr 16 14:23:56.744935 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744881 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-agent" Apr 16 14:23:56.744935 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744892 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-container" Apr 16 14:23:56.744935 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744897 2574 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-container" Apr 16 14:23:56.744935 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744908 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="storage-initializer" Apr 16 14:23:56.744935 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744914 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="storage-initializer" Apr 16 14:23:56.745098 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744963 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-agent" Apr 16 14:23:56.745098 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.744973 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dafaa76d-5cfd-4c42-8c6d-bb60c47bb7a4" containerName="kserve-container" Apr 16 14:23:56.747933 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.747917 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:23:56.761869 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.761846 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6"] Apr 16 14:23:56.920389 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:56.920285 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21d0dbaa-6f1b-4c0a-903c-0399acad23e7-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-ktxc6\" (UID: \"21d0dbaa-6f1b-4c0a-903c-0399acad23e7\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:23:57.021708 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:57.021666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21d0dbaa-6f1b-4c0a-903c-0399acad23e7-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-ktxc6\" (UID: \"21d0dbaa-6f1b-4c0a-903c-0399acad23e7\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:23:57.022056 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:57.022034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21d0dbaa-6f1b-4c0a-903c-0399acad23e7-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-ktxc6\" (UID: \"21d0dbaa-6f1b-4c0a-903c-0399acad23e7\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:23:57.057849 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:57.057811 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:23:57.189748 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:57.189718 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6"] Apr 16 14:23:57.190481 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:23:57.190456 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21d0dbaa_6f1b_4c0a_903c_0399acad23e7.slice/crio-3ca6675a0e2a31c5c6acb0e68b6c4326d11d7d4154ac8df30de78fea5623496e WatchSource:0}: Error finding container 3ca6675a0e2a31c5c6acb0e68b6c4326d11d7d4154ac8df30de78fea5623496e: Status 404 returned error can't find the container with id 3ca6675a0e2a31c5c6acb0e68b6c4326d11d7d4154ac8df30de78fea5623496e Apr 16 14:23:57.595551 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:57.595518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" event={"ID":"21d0dbaa-6f1b-4c0a-903c-0399acad23e7","Type":"ContainerStarted","Data":"2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5"} Apr 16 14:23:57.595551 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:57.595553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" event={"ID":"21d0dbaa-6f1b-4c0a-903c-0399acad23e7","Type":"ContainerStarted","Data":"3ca6675a0e2a31c5c6acb0e68b6c4326d11d7d4154ac8df30de78fea5623496e"} Apr 16 14:23:59.599852 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.599825 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:23:59.602763 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.602734 2574 generic.go:358] "Generic (PLEG): container finished" podID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerID="2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979" exitCode=0 Apr 16 14:23:59.602912 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.602785 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" event={"ID":"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12","Type":"ContainerDied","Data":"2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979"} Apr 16 14:23:59.602912 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.602805 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" Apr 16 14:23:59.602912 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.602832 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw" event={"ID":"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12","Type":"ContainerDied","Data":"a556fa0f334024fe07a92adff4e9e23f34007755a54182fe51bbfca08c824985"} Apr 16 14:23:59.602912 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.602855 2574 scope.go:117] "RemoveContainer" containerID="2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979" Apr 16 14:23:59.610399 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.610381 2574 scope.go:117] "RemoveContainer" containerID="ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e" Apr 16 14:23:59.618045 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.618028 2574 scope.go:117] "RemoveContainer" containerID="2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979" Apr 16 14:23:59.618337 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:23:59.618306 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979\": container with ID starting with 2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979 not found: ID does not exist" containerID="2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979" Apr 16 14:23:59.618422 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.618335 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979"} err="failed to get container status \"2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979\": rpc error: code = NotFound desc = could not find container \"2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979\": container with ID starting with 2dea5e2dfad7ccbae51f5a658d3da0779022f0cbeb47db32ff6ea274a4b4a979 not found: ID does not exist" Apr 16 14:23:59.618422 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.618359 2574 scope.go:117] "RemoveContainer" containerID="ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e" Apr 16 14:23:59.618628 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:23:59.618603 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e\": container with ID starting with ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e not found: ID does not exist" containerID="ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e" Apr 16 14:23:59.618737 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.618632 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e"} err="failed to get container status \"ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e\": rpc error: code = NotFound desc = could not find container \"ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e\": container with ID starting with ce15366ba29296562984049b9e99a80b62fc3d3dc04f7c983bfc1e6bfaf0207e not found: ID does not exist" Apr 16 14:23:59.647128 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.647089 2574 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12-kserve-provision-location\") pod \"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12\" (UID: \"2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12\") " Apr 16 14:23:59.656792 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.656763 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" (UID: "2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:23:59.747926 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.747833 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:23:59.924290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.924253 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw"] Apr 16 14:23:59.930752 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:23:59.930726 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-frsnw"] Apr 16 14:24:01.187506 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:01.187477 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" path="/var/lib/kubelet/pods/2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12/volumes" Apr 16 14:24:02.614428 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:02.614393 2574 generic.go:358] "Generic (PLEG): container finished" podID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerID="2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5" exitCode=0 Apr 16 14:24:02.614835 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:02.614467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" event={"ID":"21d0dbaa-6f1b-4c0a-903c-0399acad23e7","Type":"ContainerDied","Data":"2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5"} Apr 16 14:24:03.618789 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:03.618753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" event={"ID":"21d0dbaa-6f1b-4c0a-903c-0399acad23e7","Type":"ContainerStarted","Data":"991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411"} Apr 16 14:24:03.619220 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:03.619052 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:24:03.620257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:03.620236 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 14:24:03.637524 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:03.637482 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podStartSLOduration=7.637468505 podStartE2EDuration="7.637468505s" podCreationTimestamp="2026-04-16 14:23:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:24:03.634961442 +0000 UTC m=+1484.996504209" watchObservedRunningTime="2026-04-16 14:24:03.637468505 +0000 UTC m=+1484.999011270" Apr 16 14:24:04.621988 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:04.621954 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 14:24:14.622140 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:14.622090 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 14:24:24.622325 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:24.622278 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 14:24:34.622523 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:34.622472 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 14:24:42.131845 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:42.131767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:24:42.132445 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:42.132426 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:24:44.622806 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:44.622762 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.36:8080: connect: connection refused" Apr 16 14:24:54.623012 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:54.622978 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:24:58.210025 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.209991 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6"] Apr 16 14:24:58.210447 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.210255 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" 
containerName="kserve-container" containerID="cri-o://991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411" gracePeriod=30 Apr 16 14:24:58.285805 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.285769 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf"] Apr 16 14:24:58.286095 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.286082 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" Apr 16 14:24:58.286139 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.286097 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" Apr 16 14:24:58.286139 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.286113 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="storage-initializer" Apr 16 14:24:58.286139 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.286119 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="storage-initializer" Apr 16 14:24:58.286228 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.286163 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2f7f1f1d-b6a4-45b6-8b2c-0c532d657d12" containerName="kserve-container" Apr 16 14:24:58.289327 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.289306 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:24:58.298078 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.298046 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf"] Apr 16 14:24:58.426893 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.426858 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b64d92d-56e6-4fe9-9aa2-b957bd7c832e-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf\" (UID: \"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:24:58.527945 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.527858 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b64d92d-56e6-4fe9-9aa2-b957bd7c832e-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf\" (UID: \"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:24:58.528262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.528241 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b64d92d-56e6-4fe9-9aa2-b957bd7c832e-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf\" (UID: \"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:24:58.598951 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.598911 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:24:58.720714 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.720686 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf"] Apr 16 14:24:58.723610 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:24:58.723563 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b64d92d_56e6_4fe9_9aa2_b957bd7c832e.slice/crio-957a110a40944a59b1af0fdc4c2fe4bb42e9bc04816654e25dcb3868d3f0f4d5 WatchSource:0}: Error finding container 957a110a40944a59b1af0fdc4c2fe4bb42e9bc04816654e25dcb3868d3f0f4d5: Status 404 returned error can't find the container with id 957a110a40944a59b1af0fdc4c2fe4bb42e9bc04816654e25dcb3868d3f0f4d5 Apr 16 14:24:58.778889 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:58.778827 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" event={"ID":"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e","Type":"ContainerStarted","Data":"957a110a40944a59b1af0fdc4c2fe4bb42e9bc04816654e25dcb3868d3f0f4d5"} Apr 16 14:24:59.782774 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:24:59.782738 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" event={"ID":"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e","Type":"ContainerStarted","Data":"b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b"} Apr 16 14:25:01.155457 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.155435 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:25:01.252537 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.252447 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21d0dbaa-6f1b-4c0a-903c-0399acad23e7-kserve-provision-location\") pod \"21d0dbaa-6f1b-4c0a-903c-0399acad23e7\" (UID: \"21d0dbaa-6f1b-4c0a-903c-0399acad23e7\") " Apr 16 14:25:01.262231 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.262198 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21d0dbaa-6f1b-4c0a-903c-0399acad23e7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "21d0dbaa-6f1b-4c0a-903c-0399acad23e7" (UID: "21d0dbaa-6f1b-4c0a-903c-0399acad23e7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:25:01.353928 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.353882 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/21d0dbaa-6f1b-4c0a-903c-0399acad23e7-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:25:01.791432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.791394 2574 generic.go:358] "Generic (PLEG): container finished" podID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerID="991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411" exitCode=0 Apr 16 14:25:01.791649 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.791460 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" Apr 16 14:25:01.791649 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.791475 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" event={"ID":"21d0dbaa-6f1b-4c0a-903c-0399acad23e7","Type":"ContainerDied","Data":"991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411"} Apr 16 14:25:01.791649 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.791512 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6" event={"ID":"21d0dbaa-6f1b-4c0a-903c-0399acad23e7","Type":"ContainerDied","Data":"3ca6675a0e2a31c5c6acb0e68b6c4326d11d7d4154ac8df30de78fea5623496e"} Apr 16 14:25:01.791649 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.791528 2574 scope.go:117] "RemoveContainer" containerID="991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411" Apr 16 14:25:01.802367 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.802343 2574 scope.go:117] "RemoveContainer" containerID="2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5" Apr 16 14:25:01.809514 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.809493 2574 scope.go:117] "RemoveContainer" containerID="991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411" Apr 16 14:25:01.809773 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:25:01.809755 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411\": container with ID starting with 991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411 not found: ID does not exist" containerID="991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411" Apr 16 14:25:01.809819 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.809783 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411"} err="failed to get container status \"991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411\": rpc error: code = NotFound desc = could not find container \"991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411\": container with ID starting with 991af2da47e930f7a2151693985960d8f63dabfaf2d41963c494bb5c78b73411 not found: ID does not exist" Apr 16 14:25:01.809819 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.809801 2574 scope.go:117] "RemoveContainer" containerID="2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5" Apr 16 14:25:01.810034 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:25:01.810018 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5\": container with ID starting with 2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5 not found: ID does not exist" containerID="2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5" Apr 16 14:25:01.810071 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.810039 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5"} err="failed to get container status \"2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5\": rpc error: 
code = NotFound desc = could not find container \"2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5\": container with ID starting with 2b044c2e497597d3488187cf5979c5085a004559a76603fac23de8a80fd5ecf5 not found: ID does not exist" Apr 16 14:25:01.818016 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.817990 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6"] Apr 16 14:25:01.824925 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:01.824902 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-ktxc6"] Apr 16 14:25:03.187419 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:03.187387 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" path="/var/lib/kubelet/pods/21d0dbaa-6f1b-4c0a-903c-0399acad23e7/volumes" Apr 16 14:25:03.799747 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:03.799715 2574 generic.go:358] "Generic (PLEG): container finished" podID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerID="b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b" exitCode=0 Apr 16 14:25:03.799944 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:03.799760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" event={"ID":"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e","Type":"ContainerDied","Data":"b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b"} Apr 16 14:25:04.804730 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:04.804694 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" event={"ID":"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e","Type":"ContainerStarted","Data":"3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787"} Apr 16 14:25:04.805120 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:04.805000 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:25:04.806417 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:04.806390 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 14:25:04.834901 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:04.834855 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podStartSLOduration=6.834841445 podStartE2EDuration="6.834841445s" podCreationTimestamp="2026-04-16 14:24:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:25:04.833159159 +0000 UTC m=+1546.194701924" watchObservedRunningTime="2026-04-16 14:25:04.834841445 +0000 UTC m=+1546.196384212" Apr 16 14:25:05.807987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:05.807952 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 
14:25:15.808911 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:15.808870 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 14:25:25.808443 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:25.808397 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 14:25:35.808457 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:35.808419 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 14:25:45.808434 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:45.808397 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.37:8080: connect: connection refused" Apr 16 14:25:55.809446 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:25:55.809410 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:26:00.105006 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.104978 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf"] Apr 16 14:26:00.105435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.105228 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" containerID="cri-o://3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787" gracePeriod=30 Apr 16 14:26:00.191516 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.191481 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg"] Apr 16 14:26:00.191790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.191777 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" Apr 16 14:26:00.191838 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.191792 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" Apr 16 14:26:00.191838 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.191804 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="storage-initializer" Apr 16 14:26:00.191838 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.191810 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="storage-initializer" Apr 16 14:26:00.191934 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.191863 2574 
memory_manager.go:356] "RemoveStaleState removing state" podUID="21d0dbaa-6f1b-4c0a-903c-0399acad23e7" containerName="kserve-container" Apr 16 14:26:00.194760 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.194727 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:26:00.203790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.203763 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg"] Apr 16 14:26:00.328895 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.328858 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/429fa2ff-f6c2-4941-9b07-88bb8d71504d-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-7llpg\" (UID: \"429fa2ff-f6c2-4941-9b07-88bb8d71504d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:26:00.429473 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.429385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/429fa2ff-f6c2-4941-9b07-88bb8d71504d-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-7llpg\" (UID: \"429fa2ff-f6c2-4941-9b07-88bb8d71504d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:26:00.429794 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.429775 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/429fa2ff-f6c2-4941-9b07-88bb8d71504d-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-7llpg\" (UID: \"429fa2ff-f6c2-4941-9b07-88bb8d71504d\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:26:00.505052 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.505018 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:26:00.630266 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.630088 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg"] Apr 16 14:26:00.632775 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:26:00.632746 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod429fa2ff_f6c2_4941_9b07_88bb8d71504d.slice/crio-c2a6624a4c8c96aa5a3a13225a4809e779e891971582a5c425353dcdd24cf0f4 WatchSource:0}: Error finding container c2a6624a4c8c96aa5a3a13225a4809e779e891971582a5c425353dcdd24cf0f4: Status 404 returned error can't find the container with id c2a6624a4c8c96aa5a3a13225a4809e779e891971582a5c425353dcdd24cf0f4 Apr 16 14:26:00.634945 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.634922 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:26:00.973361 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.973273 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" event={"ID":"429fa2ff-f6c2-4941-9b07-88bb8d71504d","Type":"ContainerStarted","Data":"4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69"} Apr 16 14:26:00.973361 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:00.973323 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" event={"ID":"429fa2ff-f6c2-4941-9b07-88bb8d71504d","Type":"ContainerStarted","Data":"c2a6624a4c8c96aa5a3a13225a4809e779e891971582a5c425353dcdd24cf0f4"} Apr 16 14:26:02.952060 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.952039 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:26:02.980002 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.979968 2574 generic.go:358] "Generic (PLEG): container finished" podID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerID="3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787" exitCode=0 Apr 16 14:26:02.980182 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.980046 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" Apr 16 14:26:02.980182 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.980043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" event={"ID":"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e","Type":"ContainerDied","Data":"3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787"} Apr 16 14:26:02.980182 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.980086 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf" event={"ID":"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e","Type":"ContainerDied","Data":"957a110a40944a59b1af0fdc4c2fe4bb42e9bc04816654e25dcb3868d3f0f4d5"} Apr 16 14:26:02.980182 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.980103 2574 scope.go:117] "RemoveContainer" containerID="3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787" Apr 16 14:26:02.987855 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.987835 2574 scope.go:117] "RemoveContainer" containerID="b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b" Apr 16 14:26:02.994881 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.994862 2574 scope.go:117] "RemoveContainer" containerID="3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787" Apr 16 14:26:02.995129 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:26:02.995111 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787\": container with ID starting with 3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787 not found: ID does not exist" containerID="3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787" Apr 16 14:26:02.995189 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.995137 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787"} err="failed to get container status \"3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787\": rpc error: code = NotFound desc = could not find container \"3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787\": container with ID starting with 3f78abc8fc5da5d171da5a2bd1b564e7cb72484dd40d29fbf18637dda45fe787 not found: ID does not exist" Apr 16 14:26:02.995189 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.995157 2574 scope.go:117] "RemoveContainer" containerID="b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b" Apr 16 14:26:02.995377 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:26:02.995360 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b\": container with ID starting with b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b not found: ID does not exist" containerID="b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b" Apr 16 14:26:02.995425 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:02.995383 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b"} err="failed to get container status \"b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b\": rpc 
error: code = NotFound desc = could not find container \"b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b\": container with ID starting with b61dae75b70cfc34d8e56b16e8dd1e5aea5fc06e5197a4acfe75ccfc3cd26d0b not found: ID does not exist" Apr 16 14:26:03.051701 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:03.051665 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b64d92d-56e6-4fe9-9aa2-b957bd7c832e-kserve-provision-location\") pod \"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e\" (UID: \"3b64d92d-56e6-4fe9-9aa2-b957bd7c832e\") " Apr 16 14:26:03.061557 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:03.061529 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b64d92d-56e6-4fe9-9aa2-b957bd7c832e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" (UID: "3b64d92d-56e6-4fe9-9aa2-b957bd7c832e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:03.152792 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:03.152754 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b64d92d-56e6-4fe9-9aa2-b957bd7c832e-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:26:03.295805 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:03.295730 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf"] Apr 16 14:26:03.299962 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:03.299938 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-2cjwf"] Apr 16 14:26:04.988177 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:04.988147 2574 generic.go:358] "Generic (PLEG): container finished" podID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerID="4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69" exitCode=0 Apr 16 14:26:04.988544 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:04.988225 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" event={"ID":"429fa2ff-f6c2-4941-9b07-88bb8d71504d","Type":"ContainerDied","Data":"4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69"} Apr 16 14:26:05.187114 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:05.187083 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" path="/var/lib/kubelet/pods/3b64d92d-56e6-4fe9-9aa2-b957bd7c832e/volumes" Apr 16 14:26:13.015836 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:13.015800 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" event={"ID":"429fa2ff-f6c2-4941-9b07-88bb8d71504d","Type":"ContainerStarted","Data":"e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389"} Apr 16 14:26:13.016362 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:13.016166 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:26:13.017360 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:13.017336 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:26:13.039303 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:13.039249 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podStartSLOduration=5.889402951 podStartE2EDuration="13.039233306s" podCreationTimestamp="2026-04-16 14:26:00 +0000 UTC" firstStartedPulling="2026-04-16 14:26:04.989319739 +0000 UTC m=+1606.350862486" lastFinishedPulling="2026-04-16 14:26:12.139150079 +0000 UTC m=+1613.500692841" observedRunningTime="2026-04-16 14:26:13.036546259 +0000 UTC m=+1614.398089029" watchObservedRunningTime="2026-04-16 14:26:13.039233306 +0000 UTC m=+1614.400776134" Apr 16 14:26:14.019055 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:14.019016 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:26:24.019245 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:24.019190 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:26:34.019850 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:34.019793 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:26:44.019992 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:44.019946 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:26:54.019694 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:26:54.019651 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:27:04.020076 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:04.020028 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:27:14.019744 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:14.019701 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:27:24.019430 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:24.019379 2574 prober.go:120] 
"Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:27:33.186985 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:33.186955 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:27:41.204166 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.204076 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg"] Apr 16 14:27:41.204598 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.204375 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" containerID="cri-o://e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389" gracePeriod=30 Apr 16 14:27:41.305656 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.305620 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b"] Apr 16 14:27:41.305927 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.305916 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" Apr 16 14:27:41.305976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.305929 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" Apr 16 14:27:41.305976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.305944 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="storage-initializer" Apr 16 14:27:41.305976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.305951 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="storage-initializer" Apr 16 14:27:41.306067 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.306014 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b64d92d-56e6-4fe9-9aa2-b957bd7c832e" containerName="kserve-container" Apr 16 14:27:41.308685 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.308668 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:27:41.320068 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.320042 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b"] Apr 16 14:27:41.413662 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.413615 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302acc2d-6069-4ecc-a4bf-d7e5275c0c03-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-llq4b\" (UID: \"302acc2d-6069-4ecc-a4bf-d7e5275c0c03\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:27:41.514093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.513998 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302acc2d-6069-4ecc-a4bf-d7e5275c0c03-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-llq4b\" (UID: \"302acc2d-6069-4ecc-a4bf-d7e5275c0c03\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:27:41.514388 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.514367 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302acc2d-6069-4ecc-a4bf-d7e5275c0c03-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-llq4b\" (UID: \"302acc2d-6069-4ecc-a4bf-d7e5275c0c03\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:27:41.618381 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.618332 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:27:41.735705 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:41.735672 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b"] Apr 16 14:27:41.738792 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:27:41.738763 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod302acc2d_6069_4ecc_a4bf_d7e5275c0c03.slice/crio-f178d3963952ff1c89c3cfd98175b09338ded67111a93bf9c41f81221acb24ed WatchSource:0}: Error finding container f178d3963952ff1c89c3cfd98175b09338ded67111a93bf9c41f81221acb24ed: Status 404 returned error can't find the container with id f178d3963952ff1c89c3cfd98175b09338ded67111a93bf9c41f81221acb24ed Apr 16 14:27:42.268184 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:42.268141 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" event={"ID":"302acc2d-6069-4ecc-a4bf-d7e5275c0c03","Type":"ContainerStarted","Data":"c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e"} Apr 16 14:27:42.268184 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:42.268189 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" event={"ID":"302acc2d-6069-4ecc-a4bf-d7e5275c0c03","Type":"ContainerStarted","Data":"f178d3963952ff1c89c3cfd98175b09338ded67111a93bf9c41f81221acb24ed"} Apr 16 14:27:43.183695 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:43.183656 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.38:8080: connect: connection refused" Apr 16 14:27:44.840179 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:44.840156 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:27:44.940353 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:44.940266 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/429fa2ff-f6c2-4941-9b07-88bb8d71504d-kserve-provision-location\") pod \"429fa2ff-f6c2-4941-9b07-88bb8d71504d\" (UID: \"429fa2ff-f6c2-4941-9b07-88bb8d71504d\") " Apr 16 14:27:44.940601 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:44.940565 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/429fa2ff-f6c2-4941-9b07-88bb8d71504d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "429fa2ff-f6c2-4941-9b07-88bb8d71504d" (UID: "429fa2ff-f6c2-4941-9b07-88bb8d71504d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:27:45.040915 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.040877 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/429fa2ff-f6c2-4941-9b07-88bb8d71504d-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:27:45.278290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.278259 2574 generic.go:358] "Generic (PLEG): container finished" podID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerID="e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389" exitCode=0 Apr 16 14:27:45.278481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.278302 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" event={"ID":"429fa2ff-f6c2-4941-9b07-88bb8d71504d","Type":"ContainerDied","Data":"e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389"} Apr 16 14:27:45.278481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.278326 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" Apr 16 14:27:45.278481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.278345 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg" event={"ID":"429fa2ff-f6c2-4941-9b07-88bb8d71504d","Type":"ContainerDied","Data":"c2a6624a4c8c96aa5a3a13225a4809e779e891971582a5c425353dcdd24cf0f4"} Apr 16 14:27:45.278481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.278365 2574 scope.go:117] "RemoveContainer" containerID="e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389" Apr 16 14:27:45.285991 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.285972 2574 scope.go:117] "RemoveContainer" containerID="4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69" Apr 16 14:27:45.293521 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.293492 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg"] Apr 16 14:27:45.293881 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.293863 2574 scope.go:117] "RemoveContainer" containerID="e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389" Apr 16 14:27:45.296535 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.296510 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-7llpg"] Apr 16 14:27:45.297836 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:27:45.297812 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389\": container with ID starting with e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389 not found: ID does not exist" containerID="e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389" Apr 16 14:27:45.297976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.297845 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389"} err="failed to get container status \"e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389\": rpc error: code = NotFound desc = could not find container \"e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389\": 
container with ID starting with e783e078c03d6e438c91538a5cdd5e5213c998db9172ac33457dee312fd10389 not found: ID does not exist" Apr 16 14:27:45.298034 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.297979 2574 scope.go:117] "RemoveContainer" containerID="4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69" Apr 16 14:27:45.298238 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:27:45.298217 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69\": container with ID starting with 4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69 not found: ID does not exist" containerID="4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69" Apr 16 14:27:45.298337 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:45.298241 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69"} err="failed to get container status \"4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69\": rpc error: code = NotFound desc = could not find container \"4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69\": container with ID starting with 4c9a69583064dce06c95c1d42483e8a46a38a955f26fbe7ef55365d575efce69 not found: ID does not exist" Apr 16 14:27:46.282566 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:46.282536 2574 generic.go:358] "Generic (PLEG): container finished" podID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerID="c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e" exitCode=0 Apr 16 14:27:46.282990 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:46.282619 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" event={"ID":"302acc2d-6069-4ecc-a4bf-d7e5275c0c03","Type":"ContainerDied","Data":"c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e"} Apr 16 14:27:47.187105 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:47.187073 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" path="/var/lib/kubelet/pods/429fa2ff-f6c2-4941-9b07-88bb8d71504d/volumes" Apr 16 14:27:47.288159 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:47.288115 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" event={"ID":"302acc2d-6069-4ecc-a4bf-d7e5275c0c03","Type":"ContainerStarted","Data":"85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2"} Apr 16 14:27:47.288635 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:47.288393 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:27:47.289892 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:47.289861 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:27:47.303217 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:47.303167 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podStartSLOduration=6.303150944 
podStartE2EDuration="6.303150944s" podCreationTimestamp="2026-04-16 14:27:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:27:47.302412704 +0000 UTC m=+1708.663955471" watchObservedRunningTime="2026-04-16 14:27:47.303150944 +0000 UTC m=+1708.664693712" Apr 16 14:27:48.291161 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:48.291115 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:27:58.291300 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:27:58.291258 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:28:08.291474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:28:08.291427 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:28:18.291822 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:28:18.291769 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:28:28.291389 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:28:28.291345 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:28:38.292116 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:28:38.292071 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:28:48.291144 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:28:48.291093 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:28:58.291380 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:28:58.291331 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: connect: connection refused" Apr 16 14:29:06.184790 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:06.184754 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:29:12.391424 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.391387 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b"] Apr 16 14:29:12.391829 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.391683 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" containerID="cri-o://85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2" gracePeriod=30 Apr 16 14:29:12.499964 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.499928 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg"] Apr 16 14:29:12.500238 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.500227 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="storage-initializer" Apr 16 14:29:12.500287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.500241 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="storage-initializer" Apr 16 14:29:12.500287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.500251 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" Apr 16 14:29:12.500287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.500258 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" Apr 16 14:29:12.500387 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.500315 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="429fa2ff-f6c2-4941-9b07-88bb8d71504d" containerName="kserve-container" Apr 16 14:29:12.502946 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.502926 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:29:12.511898 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.511873 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg"] Apr 16 14:29:12.649417 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.649320 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b320eded-5563-4037-ae7a-05c7b32c31f9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg\" (UID: \"b320eded-5563-4037-ae7a-05c7b32c31f9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:29:12.750415 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.750374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b320eded-5563-4037-ae7a-05c7b32c31f9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg\" (UID: \"b320eded-5563-4037-ae7a-05c7b32c31f9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:29:12.750798 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.750778 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b320eded-5563-4037-ae7a-05c7b32c31f9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg\" (UID: \"b320eded-5563-4037-ae7a-05c7b32c31f9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:29:12.813245 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.813209 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:29:12.946854 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:12.946775 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg"] Apr 16 14:29:12.950663 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:29:12.950630 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb320eded_5563_4037_ae7a_05c7b32c31f9.slice/crio-16a4d633cd378407c76fd9b2423ec45393366d9c02f8f925aa79722137ba6539 WatchSource:0}: Error finding container 16a4d633cd378407c76fd9b2423ec45393366d9c02f8f925aa79722137ba6539: Status 404 returned error can't find the container with id 16a4d633cd378407c76fd9b2423ec45393366d9c02f8f925aa79722137ba6539 Apr 16 14:29:13.538426 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:13.538391 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" event={"ID":"b320eded-5563-4037-ae7a-05c7b32c31f9","Type":"ContainerStarted","Data":"344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1"} Apr 16 14:29:13.538426 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:13.538428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" event={"ID":"b320eded-5563-4037-ae7a-05c7b32c31f9","Type":"ContainerStarted","Data":"16a4d633cd378407c76fd9b2423ec45393366d9c02f8f925aa79722137ba6539"} Apr 16 14:29:16.232517 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.232493 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:29:16.379241 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.379205 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302acc2d-6069-4ecc-a4bf-d7e5275c0c03-kserve-provision-location\") pod \"302acc2d-6069-4ecc-a4bf-d7e5275c0c03\" (UID: \"302acc2d-6069-4ecc-a4bf-d7e5275c0c03\") " Apr 16 14:29:16.379570 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.379541 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/302acc2d-6069-4ecc-a4bf-d7e5275c0c03-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "302acc2d-6069-4ecc-a4bf-d7e5275c0c03" (UID: "302acc2d-6069-4ecc-a4bf-d7e5275c0c03"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:29:16.480411 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.480373 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/302acc2d-6069-4ecc-a4bf-d7e5275c0c03-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:29:16.549114 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.549080 2574 generic.go:358] "Generic (PLEG): container finished" podID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerID="85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2" exitCode=0 Apr 16 14:29:16.549290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.549154 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" Apr 16 14:29:16.549290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.549167 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" event={"ID":"302acc2d-6069-4ecc-a4bf-d7e5275c0c03","Type":"ContainerDied","Data":"85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2"} Apr 16 14:29:16.549290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.549206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" event={"ID":"302acc2d-6069-4ecc-a4bf-d7e5275c0c03","Type":"ContainerDied","Data":"f178d3963952ff1c89c3cfd98175b09338ded67111a93bf9c41f81221acb24ed"} Apr 16 14:29:16.549290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.549222 2574 scope.go:117] "RemoveContainer" containerID="85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2" Apr 16 14:29:16.558672 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.558653 2574 scope.go:117] "RemoveContainer" containerID="c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e" Apr 16 14:29:16.566166 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.566147 2574 scope.go:117] "RemoveContainer" containerID="85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2" Apr 16 14:29:16.566432 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:29:16.566415 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2\": container with ID starting with 85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2 not found: ID does not exist" containerID="85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2" Apr 16 14:29:16.566479 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.566441 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2"} err="failed to get container status \"85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2\": rpc error: code = NotFound desc = could not find container \"85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2\": container with ID starting with 85d90757fea76f796105b25a5405bb3df1320d151c3b6d43026606f328b2cec2 not found: ID does not exist" Apr 16 14:29:16.566479 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.566458 2574 scope.go:117] "RemoveContainer" containerID="c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e" Apr 16 14:29:16.566679 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:29:16.566661 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e\": container with ID starting with c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e not found: ID does not exist" containerID="c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e" Apr 16 14:29:16.566728 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.566685 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e"} err="failed to get container status \"c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e\": rpc error: code = 
NotFound desc = could not find container \"c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e\": container with ID starting with c102354283e257399fb8af1d0aaf63716d9799e9d7b211388e8a6fff5dd5094e not found: ID does not exist" Apr 16 14:29:16.572994 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.572970 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b"] Apr 16 14:29:16.580738 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:16.580677 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b"] Apr 16 14:29:17.184704 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:17.184662 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-llq4b" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.39:8080: i/o timeout" Apr 16 14:29:17.187144 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:17.187121 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" path="/var/lib/kubelet/pods/302acc2d-6069-4ecc-a4bf-d7e5275c0c03/volumes" Apr 16 14:29:17.553906 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:17.553825 2574 generic.go:358] "Generic (PLEG): container finished" podID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerID="344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1" exitCode=0 Apr 16 14:29:17.553906 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:17.553891 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" event={"ID":"b320eded-5563-4037-ae7a-05c7b32c31f9","Type":"ContainerDied","Data":"344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1"} Apr 16 14:29:18.559593 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:18.559550 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" event={"ID":"b320eded-5563-4037-ae7a-05c7b32c31f9","Type":"ContainerStarted","Data":"66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942"} Apr 16 14:29:18.559993 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:18.559862 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:29:18.561237 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:18.561214 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:29:18.600075 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:18.600017 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podStartSLOduration=6.599997283 podStartE2EDuration="6.599997283s" podCreationTimestamp="2026-04-16 14:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:29:18.599732197 +0000 UTC m=+1799.961274965" watchObservedRunningTime="2026-04-16 14:29:18.599997283 +0000 UTC m=+1799.961540054" Apr 16 14:29:19.562773 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:29:19.562683 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:29:29.563521 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:29.563474 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:29:39.562717 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:39.562673 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:29:42.152226 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:42.152194 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:29:42.153917 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:42.153899 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:29:49.563311 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:49.563266 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:29:59.563406 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:29:59.563350 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:30:09.563478 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:09.563428 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:30:19.563087 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:19.563043 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:30:29.563539 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:29.563491 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.40:8080: connect: connection refused" Apr 16 14:30:39.564217 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:39.564137 
2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:30:43.503565 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.503529 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg"] Apr 16 14:30:43.503958 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.503807 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" containerID="cri-o://66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942" gracePeriod=30 Apr 16 14:30:43.599651 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.599618 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2"] Apr 16 14:30:43.599943 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.599930 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" Apr 16 14:30:43.599987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.599946 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" Apr 16 14:30:43.599987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.599957 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="storage-initializer" Apr 16 14:30:43.599987 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.599963 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="storage-initializer" Apr 16 14:30:43.600081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.600023 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="302acc2d-6069-4ecc-a4bf-d7e5275c0c03" containerName="kserve-container" Apr 16 14:30:43.602938 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.602919 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:30:43.611011 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.610984 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2"] Apr 16 14:30:43.730152 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.730103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a043422-6632-47b6-8ef5-c26d042416c2-kserve-provision-location\") pod \"isvc-primary-a95785-predictor-6d466dc878-8wsz2\" (UID: \"4a043422-6632-47b6-8ef5-c26d042416c2\") " pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:30:43.831343 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.831298 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a043422-6632-47b6-8ef5-c26d042416c2-kserve-provision-location\") pod \"isvc-primary-a95785-predictor-6d466dc878-8wsz2\" (UID: \"4a043422-6632-47b6-8ef5-c26d042416c2\") " pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:30:43.831688 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.831667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a043422-6632-47b6-8ef5-c26d042416c2-kserve-provision-location\") pod \"isvc-primary-a95785-predictor-6d466dc878-8wsz2\" (UID: \"4a043422-6632-47b6-8ef5-c26d042416c2\") " pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:30:43.915352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:43.915315 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:30:44.029271 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:44.029241 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2"] Apr 16 14:30:44.032843 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:30:44.032815 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a043422_6632_47b6_8ef5_c26d042416c2.slice/crio-5a545802ab69ef3b7d052e65d319658a5e9b8809aa3ef8be812f8c157d49281f WatchSource:0}: Error finding container 5a545802ab69ef3b7d052e65d319658a5e9b8809aa3ef8be812f8c157d49281f: Status 404 returned error can't find the container with id 5a545802ab69ef3b7d052e65d319658a5e9b8809aa3ef8be812f8c157d49281f Apr 16 14:30:44.810181 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:44.810142 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" event={"ID":"4a043422-6632-47b6-8ef5-c26d042416c2","Type":"ContainerStarted","Data":"6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885"} Apr 16 14:30:44.810181 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:44.810182 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" event={"ID":"4a043422-6632-47b6-8ef5-c26d042416c2","Type":"ContainerStarted","Data":"5a545802ab69ef3b7d052e65d319658a5e9b8809aa3ef8be812f8c157d49281f"} Apr 16 14:30:47.144609 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.144568 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:30:47.258353 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.258271 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b320eded-5563-4037-ae7a-05c7b32c31f9-kserve-provision-location\") pod \"b320eded-5563-4037-ae7a-05c7b32c31f9\" (UID: \"b320eded-5563-4037-ae7a-05c7b32c31f9\") " Apr 16 14:30:47.258548 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.258526 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b320eded-5563-4037-ae7a-05c7b32c31f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b320eded-5563-4037-ae7a-05c7b32c31f9" (UID: "b320eded-5563-4037-ae7a-05c7b32c31f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:47.359672 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.359634 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b320eded-5563-4037-ae7a-05c7b32c31f9-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:30:47.821843 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.821806 2574 generic.go:358] "Generic (PLEG): container finished" podID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerID="66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942" exitCode=0 Apr 16 14:30:47.822048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.821897 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" Apr 16 14:30:47.822048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.821895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" event={"ID":"b320eded-5563-4037-ae7a-05c7b32c31f9","Type":"ContainerDied","Data":"66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942"} Apr 16 14:30:47.822048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.821943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg" event={"ID":"b320eded-5563-4037-ae7a-05c7b32c31f9","Type":"ContainerDied","Data":"16a4d633cd378407c76fd9b2423ec45393366d9c02f8f925aa79722137ba6539"} Apr 16 14:30:47.822048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.821962 2574 scope.go:117] "RemoveContainer" containerID="66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942" Apr 16 14:30:47.831827 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.831803 2574 scope.go:117] "RemoveContainer" containerID="344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1" Apr 16 14:30:47.839736 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.839714 2574 scope.go:117] "RemoveContainer" containerID="66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942" Apr 16 14:30:47.840031 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:30:47.840011 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942\": container with ID starting with 66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942 not found: ID does not exist" containerID="66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942" Apr 16 14:30:47.840088 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.840039 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942"} err="failed to get container status \"66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942\": rpc error: code = NotFound desc = could not find container \"66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942\": container with ID starting with 66ccbf02e71da4a37ac6fd756fb90289ed3400a29606ddd1b44ee1b798958942 not found: ID does not exist" Apr 16 14:30:47.840088 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.840058 2574 scope.go:117] "RemoveContainer" containerID="344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1" Apr 16 14:30:47.840307 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:30:47.840288 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1\": container with ID starting with 344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1 not found: ID does not exist" containerID="344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1" Apr 16 14:30:47.840359 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.840311 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1"} err="failed to get container status \"344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1\": rpc error: 
code = NotFound desc = could not find container \"344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1\": container with ID starting with 344b71b4f54df924647069391041488a8aa44ccce44f3d593e44d5c7f6ee46e1 not found: ID does not exist" Apr 16 14:30:47.843523 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.843499 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg"] Apr 16 14:30:47.848053 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:47.848030 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-6rdcg"] Apr 16 14:30:48.825894 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:48.825863 2574 generic.go:358] "Generic (PLEG): container finished" podID="4a043422-6632-47b6-8ef5-c26d042416c2" containerID="6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885" exitCode=0 Apr 16 14:30:48.826324 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:48.825940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" event={"ID":"4a043422-6632-47b6-8ef5-c26d042416c2","Type":"ContainerDied","Data":"6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885"} Apr 16 14:30:49.187219 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:49.187141 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" path="/var/lib/kubelet/pods/b320eded-5563-4037-ae7a-05c7b32c31f9/volumes" Apr 16 14:30:49.831452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:49.831417 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" event={"ID":"4a043422-6632-47b6-8ef5-c26d042416c2","Type":"ContainerStarted","Data":"a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01"} Apr 16 14:30:49.831855 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:49.831752 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:30:49.833252 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:49.833224 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:30:49.847003 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:49.846911 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podStartSLOduration=6.846897188 podStartE2EDuration="6.846897188s" podCreationTimestamp="2026-04-16 14:30:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:49.845452849 +0000 UTC m=+1891.206995617" watchObservedRunningTime="2026-04-16 14:30:49.846897188 +0000 UTC m=+1891.208439954" Apr 16 14:30:50.834878 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:30:50.834840 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:31:00.835490 
ip-10-0-139-151 kubenswrapper[2574]: I0416 14:31:00.835445 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:31:10.835191 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:31:10.835142 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:31:20.834968 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:31:20.834923 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:31:30.835364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:31:30.835318 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:31:40.835140 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:31:40.835090 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:31:50.835411 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:31:50.835356 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.41:8080: connect: connection refused" Apr 16 14:32:00.836748 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:00.836709 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:32:03.732081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.732049 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq"] Apr 16 14:32:03.732452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.732365 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="storage-initializer" Apr 16 14:32:03.732452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.732376 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="storage-initializer" Apr 16 14:32:03.732452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.732390 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" Apr 16 14:32:03.732452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.732396 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" 
containerName="kserve-container" Apr 16 14:32:03.732452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.732437 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b320eded-5563-4037-ae7a-05c7b32c31f9" containerName="kserve-container" Apr 16 14:32:03.735374 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.735359 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:03.737433 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.737405 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 14:32:03.737572 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.737413 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-a95785\"" Apr 16 14:32:03.737824 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.737808 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-a95785-dockercfg-p7p5c\"" Apr 16 14:32:03.748164 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.748143 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq"] Apr 16 14:32:03.799530 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.799496 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca12f24e-d189-4e0a-9da0-cf099183a077-cabundle-cert\") pod \"isvc-secondary-a95785-predictor-769d5749cf-lrgmq\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:03.799714 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.799555 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca12f24e-d189-4e0a-9da0-cf099183a077-kserve-provision-location\") pod \"isvc-secondary-a95785-predictor-769d5749cf-lrgmq\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:03.900758 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.900724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca12f24e-d189-4e0a-9da0-cf099183a077-cabundle-cert\") pod \"isvc-secondary-a95785-predictor-769d5749cf-lrgmq\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:03.900976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.900782 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca12f24e-d189-4e0a-9da0-cf099183a077-kserve-provision-location\") pod \"isvc-secondary-a95785-predictor-769d5749cf-lrgmq\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:03.901156 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.901137 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca12f24e-d189-4e0a-9da0-cf099183a077-kserve-provision-location\") 
pod \"isvc-secondary-a95785-predictor-769d5749cf-lrgmq\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:03.901444 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:03.901420 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca12f24e-d189-4e0a-9da0-cf099183a077-cabundle-cert\") pod \"isvc-secondary-a95785-predictor-769d5749cf-lrgmq\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:04.045976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:04.045884 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:04.166652 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:04.166629 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq"] Apr 16 14:32:04.169340 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:32:04.169312 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca12f24e_d189_4e0a_9da0_cf099183a077.slice/crio-e6f71d1c0dd691d820c32239d6b663c39c3992ed8c9aa57ae4d411c657564d75 WatchSource:0}: Error finding container e6f71d1c0dd691d820c32239d6b663c39c3992ed8c9aa57ae4d411c657564d75: Status 404 returned error can't find the container with id e6f71d1c0dd691d820c32239d6b663c39c3992ed8c9aa57ae4d411c657564d75 Apr 16 14:32:04.171575 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:04.171558 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:32:05.045149 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:05.045112 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" event={"ID":"ca12f24e-d189-4e0a-9da0-cf099183a077","Type":"ContainerStarted","Data":"4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419"} Apr 16 14:32:05.045149 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:05.045152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" event={"ID":"ca12f24e-d189-4e0a-9da0-cf099183a077","Type":"ContainerStarted","Data":"e6f71d1c0dd691d820c32239d6b663c39c3992ed8c9aa57ae4d411c657564d75"} Apr 16 14:32:11.067713 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:11.067682 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_ca12f24e-d189-4e0a-9da0-cf099183a077/storage-initializer/0.log" Apr 16 14:32:11.068163 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:11.067725 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerID="4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419" exitCode=1 Apr 16 14:32:11.068163 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:11.067759 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" event={"ID":"ca12f24e-d189-4e0a-9da0-cf099183a077","Type":"ContainerDied","Data":"4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419"} Apr 16 14:32:12.071523 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:12.071494 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_ca12f24e-d189-4e0a-9da0-cf099183a077/storage-initializer/0.log" Apr 16 14:32:12.071922 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:12.071615 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" event={"ID":"ca12f24e-d189-4e0a-9da0-cf099183a077","Type":"ContainerStarted","Data":"965ca2f9f858456c5dc0359c589371f3a9b3db28b5b47d11ff93c46b39789262"} Apr 16 14:32:15.082652 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:15.082621 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_ca12f24e-d189-4e0a-9da0-cf099183a077/storage-initializer/1.log" Apr 16 14:32:15.083046 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:15.082988 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_ca12f24e-d189-4e0a-9da0-cf099183a077/storage-initializer/0.log" Apr 16 14:32:15.083046 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:15.083020 2574 generic.go:358] "Generic (PLEG): container finished" podID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerID="965ca2f9f858456c5dc0359c589371f3a9b3db28b5b47d11ff93c46b39789262" exitCode=1 Apr 16 14:32:15.083120 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:15.083098 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" event={"ID":"ca12f24e-d189-4e0a-9da0-cf099183a077","Type":"ContainerDied","Data":"965ca2f9f858456c5dc0359c589371f3a9b3db28b5b47d11ff93c46b39789262"} Apr 16 14:32:15.083161 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:15.083145 2574 scope.go:117] "RemoveContainer" containerID="4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419" Apr 16 14:32:15.083541 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:15.083519 2574 scope.go:117] "RemoveContainer" containerID="4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419" Apr 16 14:32:15.094148 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:32:15.094110 2574 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_kserve-ci-e2e-test_ca12f24e-d189-4e0a-9da0-cf099183a077_0 in pod sandbox e6f71d1c0dd691d820c32239d6b663c39c3992ed8c9aa57ae4d411c657564d75 from index: no such id: '4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419'" containerID="4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419" Apr 16 14:32:15.094224 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:32:15.094174 2574 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_kserve-ci-e2e-test_ca12f24e-d189-4e0a-9da0-cf099183a077_0 in pod sandbox e6f71d1c0dd691d820c32239d6b663c39c3992ed8c9aa57ae4d411c657564d75 from index: no such id: '4ca1e5afdab05b01a60c218df51a825d5ececfa40d3e227132642d28238a3419'; Skipping pod \"isvc-secondary-a95785-predictor-769d5749cf-lrgmq_kserve-ci-e2e-test(ca12f24e-d189-4e0a-9da0-cf099183a077)\"" logger="UnhandledError" Apr 16 14:32:15.095477 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:32:15.095460 2574 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-a95785-predictor-769d5749cf-lrgmq_kserve-ci-e2e-test(ca12f24e-d189-4e0a-9da0-cf099183a077)\"" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" Apr 16 14:32:16.086861 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:16.086835 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_ca12f24e-d189-4e0a-9da0-cf099183a077/storage-initializer/1.log" Apr 16 14:32:21.882316 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:21.882282 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq"] Apr 16 14:32:21.967823 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:21.967782 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2"] Apr 16 14:32:21.968131 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:21.968078 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" containerID="cri-o://a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01" gracePeriod=30 Apr 16 14:32:22.005389 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.005271 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69"] Apr 16 14:32:22.008754 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.008735 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.010849 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.010820 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-b9647c-dockercfg-vsm86\"" Apr 16 14:32:22.010963 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.010911 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-b9647c\"" Apr 16 14:32:22.015318 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.015295 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69"] Apr 16 14:32:22.017053 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.017033 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_ca12f24e-d189-4e0a-9da0-cf099183a077/storage-initializer/1.log" Apr 16 14:32:22.017596 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.017557 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:22.049137 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.049098 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca12f24e-d189-4e0a-9da0-cf099183a077-kserve-provision-location\") pod \"ca12f24e-d189-4e0a-9da0-cf099183a077\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " Apr 16 14:32:22.049339 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.049183 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca12f24e-d189-4e0a-9da0-cf099183a077-cabundle-cert\") pod \"ca12f24e-d189-4e0a-9da0-cf099183a077\" (UID: \"ca12f24e-d189-4e0a-9da0-cf099183a077\") " Apr 16 14:32:22.049339 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.049321 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-kserve-provision-location\") pod \"isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.049475 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.049449 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-cabundle-cert\") pod \"isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.049636 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.049598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca12f24e-d189-4e0a-9da0-cf099183a077-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca12f24e-d189-4e0a-9da0-cf099183a077" (UID: "ca12f24e-d189-4e0a-9da0-cf099183a077"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:32:22.049725 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.049703 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca12f24e-d189-4e0a-9da0-cf099183a077-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ca12f24e-d189-4e0a-9da0-cf099183a077" (UID: "ca12f24e-d189-4e0a-9da0-cf099183a077"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:32:22.104427 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.104402 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-a95785-predictor-769d5749cf-lrgmq_ca12f24e-d189-4e0a-9da0-cf099183a077/storage-initializer/1.log" Apr 16 14:32:22.104612 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.104518 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" Apr 16 14:32:22.104612 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.104496 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq" event={"ID":"ca12f24e-d189-4e0a-9da0-cf099183a077","Type":"ContainerDied","Data":"e6f71d1c0dd691d820c32239d6b663c39c3992ed8c9aa57ae4d411c657564d75"} Apr 16 14:32:22.104612 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.104558 2574 scope.go:117] "RemoveContainer" containerID="965ca2f9f858456c5dc0359c589371f3a9b3db28b5b47d11ff93c46b39789262" Apr 16 14:32:22.139122 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.139042 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq"] Apr 16 14:32:22.143253 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.143226 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-a95785-predictor-769d5749cf-lrgmq"] Apr 16 14:32:22.150848 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.150825 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-cabundle-cert\") pod \"isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.150954 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.150880 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-kserve-provision-location\") pod \"isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.150954 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.150930 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca12f24e-d189-4e0a-9da0-cf099183a077-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:32:22.150954 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.150940 2574 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca12f24e-d189-4e0a-9da0-cf099183a077-cabundle-cert\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:32:22.151298 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.151279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-kserve-provision-location\") pod \"isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.151415 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.151397 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-cabundle-cert\") pod \"isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " 
pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.326448 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.326412 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:22.448615 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:22.448567 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69"] Apr 16 14:32:22.452507 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:32:22.452481 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8f6f7ff_13b3_4dc9_b977_ba43eb35cb00.slice/crio-9ef1ced8d1d6306bd90027fbd6dff794adec2163a9b42e3ef734da8fa1a73401 WatchSource:0}: Error finding container 9ef1ced8d1d6306bd90027fbd6dff794adec2163a9b42e3ef734da8fa1a73401: Status 404 returned error can't find the container with id 9ef1ced8d1d6306bd90027fbd6dff794adec2163a9b42e3ef734da8fa1a73401 Apr 16 14:32:23.109442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:23.109403 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" event={"ID":"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00","Type":"ContainerStarted","Data":"b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa"} Apr 16 14:32:23.109442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:23.109451 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" event={"ID":"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00","Type":"ContainerStarted","Data":"9ef1ced8d1d6306bd90027fbd6dff794adec2163a9b42e3ef734da8fa1a73401"} Apr 16 14:32:23.187484 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:23.187448 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" path="/var/lib/kubelet/pods/ca12f24e-d189-4e0a-9da0-cf099183a077/volumes" Apr 16 14:32:26.410163 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:26.410138 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:32:26.482257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:26.482207 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a043422-6632-47b6-8ef5-c26d042416c2-kserve-provision-location\") pod \"4a043422-6632-47b6-8ef5-c26d042416c2\" (UID: \"4a043422-6632-47b6-8ef5-c26d042416c2\") " Apr 16 14:32:26.482474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:26.482448 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a043422-6632-47b6-8ef5-c26d042416c2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4a043422-6632-47b6-8ef5-c26d042416c2" (UID: "4a043422-6632-47b6-8ef5-c26d042416c2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:32:26.583041 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:26.583006 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4a043422-6632-47b6-8ef5-c26d042416c2-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:32:27.123476 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.123440 2574 generic.go:358] "Generic (PLEG): container finished" podID="4a043422-6632-47b6-8ef5-c26d042416c2" containerID="a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01" exitCode=0 Apr 16 14:32:27.123670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.123476 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" event={"ID":"4a043422-6632-47b6-8ef5-c26d042416c2","Type":"ContainerDied","Data":"a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01"} Apr 16 14:32:27.123670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.123510 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" Apr 16 14:32:27.123670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.123529 2574 scope.go:117] "RemoveContainer" containerID="a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01" Apr 16 14:32:27.123670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.123518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2" event={"ID":"4a043422-6632-47b6-8ef5-c26d042416c2","Type":"ContainerDied","Data":"5a545802ab69ef3b7d052e65d319658a5e9b8809aa3ef8be812f8c157d49281f"} Apr 16 14:32:27.131814 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.131794 2574 scope.go:117] "RemoveContainer" containerID="6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885" Apr 16 14:32:27.139120 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.139103 2574 scope.go:117] "RemoveContainer" containerID="a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01" Apr 16 14:32:27.139399 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:32:27.139378 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01\": container with ID starting with a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01 not found: ID does not exist" containerID="a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01" Apr 16 14:32:27.139491 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.139406 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01"} err="failed to get container status \"a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01\": rpc error: code = NotFound desc = could not find container \"a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01\": container with ID starting with a78721d6755aecffe5c12cbe03387dae7f6fec4d6dc4de59ce8a709d0fa1fc01 not found: ID does not exist" Apr 16 14:32:27.139491 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.139424 2574 scope.go:117] "RemoveContainer" containerID="6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885" Apr 16 14:32:27.139713 ip-10-0-139-151 
kubenswrapper[2574]: E0416 14:32:27.139696 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885\": container with ID starting with 6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885 not found: ID does not exist" containerID="6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885" Apr 16 14:32:27.139762 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.139719 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885"} err="failed to get container status \"6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885\": rpc error: code = NotFound desc = could not find container \"6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885\": container with ID starting with 6dcf6e7ced3f6d0028faacd5cfa6b242344a07a10188ddbab7e3597c6099e885 not found: ID does not exist" Apr 16 14:32:27.143571 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.143518 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2"] Apr 16 14:32:27.145224 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.145203 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-a95785-predictor-6d466dc878-8wsz2"] Apr 16 14:32:27.187331 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:27.187303 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" path="/var/lib/kubelet/pods/4a043422-6632-47b6-8ef5-c26d042416c2/volumes" Apr 16 14:32:29.133959 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:29.133925 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69_f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00/storage-initializer/0.log" Apr 16 14:32:29.134321 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:29.133964 2574 generic.go:358] "Generic (PLEG): container finished" podID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerID="b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa" exitCode=1 Apr 16 14:32:29.134321 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:29.134018 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" event={"ID":"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00","Type":"ContainerDied","Data":"b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa"} Apr 16 14:32:30.138589 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:30.138560 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69_f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00/storage-initializer/0.log" Apr 16 14:32:30.138960 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:30.138686 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" event={"ID":"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00","Type":"ContainerStarted","Data":"f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a"} Apr 16 14:32:31.982470 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:31.982438 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69"] Apr 16 14:32:31.982851 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:32:31.982679 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerName="storage-initializer" containerID="cri-o://f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a" gracePeriod=30 Apr 16 14:32:32.111966 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.111925 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt"] Apr 16 14:32:32.112366 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112348 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" Apr 16 14:32:32.112366 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112368 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112383 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerName="storage-initializer" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112390 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerName="storage-initializer" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112399 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerName="storage-initializer" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112405 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerName="storage-initializer" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112413 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="storage-initializer" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112418 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="storage-initializer" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112469 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerName="storage-initializer" Apr 16 14:32:32.112477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112475 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca12f24e-d189-4e0a-9da0-cf099183a077" containerName="storage-initializer" Apr 16 14:32:32.112726 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.112482 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4a043422-6632-47b6-8ef5-c26d042416c2" containerName="kserve-container" Apr 16 14:32:32.116890 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.116871 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:32:32.119369 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.119349 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tdtpg\"" Apr 16 14:32:32.138634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.138606 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt"] Apr 16 14:32:32.227730 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.227685 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e9920ed-9f6e-4638-a10d-27ad62c29cdc-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-f2tdt\" (UID: \"5e9920ed-9f6e-4638-a10d-27ad62c29cdc\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:32:32.328944 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.328917 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e9920ed-9f6e-4638-a10d-27ad62c29cdc-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-f2tdt\" (UID: \"5e9920ed-9f6e-4638-a10d-27ad62c29cdc\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:32:32.329249 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.329233 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e9920ed-9f6e-4638-a10d-27ad62c29cdc-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-f2tdt\" (UID: \"5e9920ed-9f6e-4638-a10d-27ad62c29cdc\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:32:32.427389 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.427352 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:32:32.549427 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.549391 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt"] Apr 16 14:32:32.553250 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:32:32.553212 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e9920ed_9f6e_4638_a10d_27ad62c29cdc.slice/crio-f0654c27a9a41d6e935ca2cc3a46616677c710cfc8b7d188c4d6fccb2d88a47a WatchSource:0}: Error finding container f0654c27a9a41d6e935ca2cc3a46616677c710cfc8b7d188c4d6fccb2d88a47a: Status 404 returned error can't find the container with id f0654c27a9a41d6e935ca2cc3a46616677c710cfc8b7d188c4d6fccb2d88a47a Apr 16 14:32:32.809201 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.809173 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69_f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00/storage-initializer/1.log" Apr 16 14:32:32.809614 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.809592 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69_f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00/storage-initializer/0.log" Apr 16 14:32:32.809723 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.809664 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:32.935028 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.934941 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-cabundle-cert\") pod \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " Apr 16 14:32:32.935173 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.935030 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-kserve-provision-location\") pod \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\" (UID: \"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00\") " Apr 16 14:32:32.935297 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.935271 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" (UID: "f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:32:32.935334 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:32.935289 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" (UID: "f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:32:33.036208 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.036169 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:32:33.036208 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.036209 2574 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00-cabundle-cert\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:32:33.149593 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.149563 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69_f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00/storage-initializer/1.log" Apr 16 14:32:33.150006 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.149988 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69_f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00/storage-initializer/0.log" Apr 16 14:32:33.150081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.150025 2574 generic.go:358] "Generic (PLEG): container finished" podID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerID="f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a" exitCode=1 Apr 16 14:32:33.150122 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.150096 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" event={"ID":"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00","Type":"ContainerDied","Data":"f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a"} Apr 16 14:32:33.150193 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.150128 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" event={"ID":"f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00","Type":"ContainerDied","Data":"9ef1ced8d1d6306bd90027fbd6dff794adec2163a9b42e3ef734da8fa1a73401"} Apr 16 14:32:33.150193 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.150101 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69" Apr 16 14:32:33.150297 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.150145 2574 scope.go:117] "RemoveContainer" containerID="f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a" Apr 16 14:32:33.151698 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.151675 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" event={"ID":"5e9920ed-9f6e-4638-a10d-27ad62c29cdc","Type":"ContainerStarted","Data":"cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7"} Apr 16 14:32:33.151775 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.151707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" event={"ID":"5e9920ed-9f6e-4638-a10d-27ad62c29cdc","Type":"ContainerStarted","Data":"f0654c27a9a41d6e935ca2cc3a46616677c710cfc8b7d188c4d6fccb2d88a47a"} Apr 16 14:32:33.159460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.159439 2574 scope.go:117] "RemoveContainer" containerID="b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa" Apr 16 14:32:33.166851 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.166830 2574 scope.go:117] "RemoveContainer" containerID="f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a" Apr 16 14:32:33.167127 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:32:33.167105 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a\": container with ID starting with f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a not found: ID does not exist" containerID="f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a" Apr 16 14:32:33.167227 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.167136 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a"} err="failed to get container status \"f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a\": rpc error: code = NotFound desc = could not find container \"f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a\": container with ID starting with f5223bd8f7b5849592e86dc7c54d3bd529366d33db9f37cde1b4cb50685ee61a not found: ID does not exist" Apr 16 14:32:33.167227 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.167156 2574 scope.go:117] "RemoveContainer" containerID="b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa" Apr 16 14:32:33.167410 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:32:33.167396 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa\": container with ID starting with b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa not found: ID does not exist" containerID="b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa" Apr 16 14:32:33.167452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.167412 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa"} err="failed to get container status 
\"b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa\": rpc error: code = NotFound desc = could not find container \"b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa\": container with ID starting with b079f1ae8f40c3d5d58b12e2defb5455f431985b0db13cf5104bacc4925deeaa not found: ID does not exist" Apr 16 14:32:33.192391 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.192316 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69"] Apr 16 14:32:33.196539 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:33.196513 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-b9647c-predictor-868dfcfb77-4wf69"] Apr 16 14:32:35.186963 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:35.186927 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" path="/var/lib/kubelet/pods/f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00/volumes" Apr 16 14:32:37.165660 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:37.165631 2574 generic.go:358] "Generic (PLEG): container finished" podID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerID="cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7" exitCode=0 Apr 16 14:32:37.166045 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:37.165692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" event={"ID":"5e9920ed-9f6e-4638-a10d-27ad62c29cdc","Type":"ContainerDied","Data":"cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7"} Apr 16 14:32:57.238481 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:57.238444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" event={"ID":"5e9920ed-9f6e-4638-a10d-27ad62c29cdc","Type":"ContainerStarted","Data":"32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4"} Apr 16 14:32:57.238958 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:57.238852 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:32:57.240131 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:57.240105 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:32:57.254258 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:57.254210 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podStartSLOduration=5.69530782 podStartE2EDuration="25.254197032s" podCreationTimestamp="2026-04-16 14:32:32 +0000 UTC" firstStartedPulling="2026-04-16 14:32:37.166797762 +0000 UTC m=+1998.528340509" lastFinishedPulling="2026-04-16 14:32:56.725686961 +0000 UTC m=+2018.087229721" observedRunningTime="2026-04-16 14:32:57.252509557 +0000 UTC m=+2018.614052321" watchObservedRunningTime="2026-04-16 14:32:57.254197032 +0000 UTC m=+2018.615739860" Apr 16 14:32:58.241775 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:32:58.241739 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" 
podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:33:08.242522 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:33:08.242475 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:33:18.242107 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:33:18.242065 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:33:28.241816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:33:28.241768 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:33:38.242194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:33:38.242096 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:33:48.241915 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:33:48.241870 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:33:58.242506 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:33:58.242463 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:34:08.242525 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:08.242473 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.44:8080: connect: connection refused" Apr 16 14:34:18.243779 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:18.243750 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:34:22.250467 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.250437 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt"] Apr 16 14:34:22.250839 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.250730 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" 
containerName="kserve-container" containerID="cri-o://32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4" gracePeriod=30 Apr 16 14:34:22.339872 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.339836 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n"] Apr 16 14:34:22.340150 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.340138 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerName="storage-initializer" Apr 16 14:34:22.340194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.340152 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerName="storage-initializer" Apr 16 14:34:22.340194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.340167 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerName="storage-initializer" Apr 16 14:34:22.340194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.340173 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerName="storage-initializer" Apr 16 14:34:22.340330 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.340231 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerName="storage-initializer" Apr 16 14:34:22.340330 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.340239 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8f6f7ff-13b3-4dc9-b977-ba43eb35cb00" containerName="storage-initializer" Apr 16 14:34:22.343157 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.343135 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:34:22.351864 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.351842 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n"] Apr 16 14:34:22.419160 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.419116 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fb0dc9-9bde-45dd-aeaf-bbdeba105f33-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-nps8n\" (UID: \"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:34:22.520454 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.520367 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fb0dc9-9bde-45dd-aeaf-bbdeba105f33-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-nps8n\" (UID: \"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:34:22.520757 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.520737 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fb0dc9-9bde-45dd-aeaf-bbdeba105f33-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-nps8n\" (UID: \"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:34:22.653391 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.653353 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:34:22.775697 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:22.775618 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n"] Apr 16 14:34:22.779051 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:34:22.779019 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29fb0dc9_9bde_45dd_aeaf_bbdeba105f33.slice/crio-da44ec6492ca1634a7a510f7cfc8bcaa19cf4832608d8d506b33e0923c57db26 WatchSource:0}: Error finding container da44ec6492ca1634a7a510f7cfc8bcaa19cf4832608d8d506b33e0923c57db26: Status 404 returned error can't find the container with id da44ec6492ca1634a7a510f7cfc8bcaa19cf4832608d8d506b33e0923c57db26 Apr 16 14:34:23.483515 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:23.483477 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" event={"ID":"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33","Type":"ContainerStarted","Data":"120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045"} Apr 16 14:34:23.483515 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:23.483518 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" event={"ID":"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33","Type":"ContainerStarted","Data":"da44ec6492ca1634a7a510f7cfc8bcaa19cf4832608d8d506b33e0923c57db26"} Apr 16 14:34:26.494176 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:26.494133 2574 generic.go:358] "Generic (PLEG): container finished" podID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerID="120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045" exitCode=0 Apr 16 14:34:26.494576 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:26.494205 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" event={"ID":"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33","Type":"ContainerDied","Data":"120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045"} Apr 16 14:34:27.092262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.092238 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:34:27.158067 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.158033 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e9920ed-9f6e-4638-a10d-27ad62c29cdc-kserve-provision-location\") pod \"5e9920ed-9f6e-4638-a10d-27ad62c29cdc\" (UID: \"5e9920ed-9f6e-4638-a10d-27ad62c29cdc\") " Apr 16 14:34:27.158351 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.158327 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9920ed-9f6e-4638-a10d-27ad62c29cdc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5e9920ed-9f6e-4638-a10d-27ad62c29cdc" (UID: "5e9920ed-9f6e-4638-a10d-27ad62c29cdc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:34:27.259189 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.259153 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5e9920ed-9f6e-4638-a10d-27ad62c29cdc-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:34:27.498862 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.498753 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" event={"ID":"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33","Type":"ContainerStarted","Data":"ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b"} Apr 16 14:34:27.499278 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.499080 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:34:27.500212 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.500189 2574 generic.go:358] "Generic (PLEG): container finished" podID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerID="32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4" exitCode=0 Apr 16 14:34:27.500340 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.500243 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" event={"ID":"5e9920ed-9f6e-4638-a10d-27ad62c29cdc","Type":"ContainerDied","Data":"32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4"} Apr 16 14:34:27.500340 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.500258 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" Apr 16 14:34:27.500340 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.500273 2574 scope.go:117] "RemoveContainer" containerID="32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4" Apr 16 14:34:27.500499 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.500264 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt" event={"ID":"5e9920ed-9f6e-4638-a10d-27ad62c29cdc","Type":"ContainerDied","Data":"f0654c27a9a41d6e935ca2cc3a46616677c710cfc8b7d188c4d6fccb2d88a47a"} Apr 16 14:34:27.500734 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.500710 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:34:27.507882 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.507855 2574 scope.go:117] "RemoveContainer" containerID="cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7" Apr 16 14:34:27.514833 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.514814 2574 scope.go:117] "RemoveContainer" containerID="32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4" Apr 16 14:34:27.515087 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:34:27.515069 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4\": container with ID starting with 
32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4 not found: ID does not exist" containerID="32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4" Apr 16 14:34:27.515135 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.515096 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4"} err="failed to get container status \"32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4\": rpc error: code = NotFound desc = could not find container \"32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4\": container with ID starting with 32171e2c6a77aa8f5ad76090dff5e5f9c623897bf4c39bf9cdafa511dc712cf4 not found: ID does not exist" Apr 16 14:34:27.515135 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.515114 2574 scope.go:117] "RemoveContainer" containerID="cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7" Apr 16 14:34:27.515355 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:34:27.515338 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7\": container with ID starting with cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7 not found: ID does not exist" containerID="cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7" Apr 16 14:34:27.515398 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.515365 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7"} err="failed to get container status \"cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7\": rpc error: code = NotFound desc = could not find container \"cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7\": container with ID starting with cea74294f3ae232aef472c98e36d8db7dfe0896bbbc09deb2bf366e7ead118c7 not found: ID does not exist" Apr 16 14:34:27.517260 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.517073 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podStartSLOduration=5.517059409 podStartE2EDuration="5.517059409s" podCreationTimestamp="2026-04-16 14:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:34:27.516831036 +0000 UTC m=+2108.878373804" watchObservedRunningTime="2026-04-16 14:34:27.517059409 +0000 UTC m=+2108.878602178" Apr 16 14:34:27.529648 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.529620 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt"] Apr 16 14:34:27.533233 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:27.533208 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-f2tdt"] Apr 16 14:34:28.509885 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:28.509851 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:34:29.187008 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:34:29.186979 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" path="/var/lib/kubelet/pods/5e9920ed-9f6e-4638-a10d-27ad62c29cdc/volumes" Apr 16 14:34:38.510053 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:38.510003 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:34:42.174824 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:42.174793 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:34:42.177360 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:42.177338 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:34:48.509937 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:48.509897 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:34:58.510558 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:34:58.510516 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:35:08.510160 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:08.510114 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:35:18.510492 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:18.510446 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:35:28.510881 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:28.510834 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:35:38.510650 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:38.510605 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:35:46.184467 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:46.184436 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:35:52.471031 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.470997 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n"] Apr 16 14:35:52.471408 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.471309 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" containerID="cri-o://ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b" gracePeriod=30 Apr 16 14:35:52.571037 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.570999 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"] Apr 16 14:35:52.571323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.571309 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" Apr 16 14:35:52.571374 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.571325 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" Apr 16 14:35:52.571374 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.571345 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="storage-initializer" Apr 16 14:35:52.571374 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.571350 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="storage-initializer" Apr 16 14:35:52.571474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.571399 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5e9920ed-9f6e-4638-a10d-27ad62c29cdc" containerName="kserve-container" Apr 16 14:35:52.574434 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.574414 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" Apr 16 14:35:52.582199 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.582173 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"] Apr 16 14:35:52.625034 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.624999 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d07a4a-a61f-41fe-9a55-66aeebd58364-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-4qg25\" (UID: \"03d07a4a-a61f-41fe-9a55-66aeebd58364\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" Apr 16 14:35:52.726073 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.725978 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d07a4a-a61f-41fe-9a55-66aeebd58364-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-4qg25\" (UID: \"03d07a4a-a61f-41fe-9a55-66aeebd58364\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" Apr 16 14:35:52.726389 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.726367 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d07a4a-a61f-41fe-9a55-66aeebd58364-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-4qg25\" (UID: \"03d07a4a-a61f-41fe-9a55-66aeebd58364\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" Apr 16 14:35:52.885738 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:52.885703 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" Apr 16 14:35:53.005119 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:53.005090 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"] Apr 16 14:35:53.007523 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:35:53.007491 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03d07a4a_a61f_41fe_9a55_66aeebd58364.slice/crio-2d003e69de2a25970698c96ac6815a347f39a0a35cda2ab7c6800901c31d2363 WatchSource:0}: Error finding container 2d003e69de2a25970698c96ac6815a347f39a0a35cda2ab7c6800901c31d2363: Status 404 returned error can't find the container with id 2d003e69de2a25970698c96ac6815a347f39a0a35cda2ab7c6800901c31d2363 Apr 16 14:35:53.754749 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:53.754715 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" event={"ID":"03d07a4a-a61f-41fe-9a55-66aeebd58364","Type":"ContainerStarted","Data":"ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36"} Apr 16 14:35:53.754749 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:53.754749 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" event={"ID":"03d07a4a-a61f-41fe-9a55-66aeebd58364","Type":"ContainerStarted","Data":"2d003e69de2a25970698c96ac6815a347f39a0a35cda2ab7c6800901c31d2363"} Apr 16 14:35:56.184051 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:56.184007 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.45:8080: connect: connection refused" Apr 16 14:35:57.313903 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.313881 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:35:57.361998 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.361963 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fb0dc9-9bde-45dd-aeaf-bbdeba105f33-kserve-provision-location\") pod \"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33\" (UID: \"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33\") " Apr 16 14:35:57.362315 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.362288 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29fb0dc9-9bde-45dd-aeaf-bbdeba105f33-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" (UID: "29fb0dc9-9bde-45dd-aeaf-bbdeba105f33"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:35:57.463323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.463218 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/29fb0dc9-9bde-45dd-aeaf-bbdeba105f33-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:35:57.767180 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.767097 2574 generic.go:358] "Generic (PLEG): container finished" podID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerID="ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36" exitCode=0 Apr 16 14:35:57.767180 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.767172 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" event={"ID":"03d07a4a-a61f-41fe-9a55-66aeebd58364","Type":"ContainerDied","Data":"ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36"} Apr 16 14:35:57.768546 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.768518 2574 generic.go:358] "Generic (PLEG): container finished" podID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerID="ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b" exitCode=0 Apr 16 14:35:57.768689 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.768546 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" event={"ID":"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33","Type":"ContainerDied","Data":"ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b"} Apr 16 14:35:57.768689 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.768574 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" Apr 16 14:35:57.768689 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.768605 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n" event={"ID":"29fb0dc9-9bde-45dd-aeaf-bbdeba105f33","Type":"ContainerDied","Data":"da44ec6492ca1634a7a510f7cfc8bcaa19cf4832608d8d506b33e0923c57db26"} Apr 16 14:35:57.768689 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.768623 2574 scope.go:117] "RemoveContainer" containerID="ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b" Apr 16 14:35:57.776210 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.776193 2574 scope.go:117] "RemoveContainer" containerID="120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045" Apr 16 14:35:57.783634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.783610 2574 scope.go:117] "RemoveContainer" containerID="ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b" Apr 16 14:35:57.784137 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:35:57.784116 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b\": container with ID starting with ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b not found: ID does not exist" containerID="ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b" Apr 16 14:35:57.784217 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.784144 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b"} err="failed to get container status \"ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b\": rpc error: code = NotFound desc = could not find container \"ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b\": container with ID starting with ace5f3752dbe89c086d62209c23dac31265c50a118cb21c614f93bd2b289e73b not found: ID does not exist" Apr 16 14:35:57.784217 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.784163 2574 scope.go:117] "RemoveContainer" containerID="120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045" Apr 16 14:35:57.784402 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:35:57.784381 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045\": container with ID starting with 120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045 not found: ID does not exist" containerID="120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045" Apr 16 14:35:57.784490 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.784404 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045"} err="failed to get container status \"120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045\": rpc error: code = NotFound desc = could not find container \"120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045\": container with ID starting with 120a463fd2e06c38c976498eb5744756c4e084b5ea583df629a3d6c5b0b8e045 not found: ID does not exist" Apr 16 14:35:57.796690 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.796666 2574 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n"] Apr 16 14:35:57.801336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:57.801312 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-nps8n"] Apr 16 14:35:58.773969 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:58.773934 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" event={"ID":"03d07a4a-a61f-41fe-9a55-66aeebd58364","Type":"ContainerStarted","Data":"bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d"} Apr 16 14:35:58.774384 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:58.774200 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" Apr 16 14:35:58.775638 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:58.775609 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:35:58.790017 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:58.789974 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podStartSLOduration=6.789960967 podStartE2EDuration="6.789960967s" podCreationTimestamp="2026-04-16 14:35:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:35:58.787933961 +0000 UTC m=+2200.149476730" watchObservedRunningTime="2026-04-16 14:35:58.789960967 +0000 UTC m=+2200.151503737" Apr 16 14:35:59.188613 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:59.188480 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" path="/var/lib/kubelet/pods/29fb0dc9-9bde-45dd-aeaf-bbdeba105f33/volumes" Apr 16 14:35:59.777035 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:35:59.776996 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:36:09.777305 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:36:09.777258 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:36:19.777767 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:36:19.777722 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:36:29.777235 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:36:29.777188 2574 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:36:39.777827 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:36:39.777744 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:36:49.777550 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:36:49.777505 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:36:59.777056 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:36:59.777004 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:37:09.777661 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:09.777615 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused" Apr 16 14:37:15.187440 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:15.187409 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" Apr 16 14:37:22.729314 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.729274 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"] Apr 16 14:37:22.729735 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.729531 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" containerID="cri-o://bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d" gracePeriod=30 Apr 16 14:37:22.842916 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.842881 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"] Apr 16 14:37:22.843333 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.843319 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" Apr 16 14:37:22.843384 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.843337 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container" Apr 16 14:37:22.843384 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.843359 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="storage-initializer" Apr 16 
Apr 16 14:37:22.843384 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.843368 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="storage-initializer"
Apr 16 14:37:22.843492 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.843436 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="29fb0dc9-9bde-45dd-aeaf-bbdeba105f33" containerName="kserve-container"
Apr 16 14:37:22.846571 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.846550 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:37:22.855810 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.855781 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"]
Apr 16 14:37:22.957470 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:22.957433 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58784149-3c35-4341-87c1-ce711b04c3b1-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp\" (UID: \"58784149-3c35-4341-87c1-ce711b04c3b1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:37:23.058837 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:23.058749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58784149-3c35-4341-87c1-ce711b04c3b1-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp\" (UID: \"58784149-3c35-4341-87c1-ce711b04c3b1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:37:23.059148 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:23.059126 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58784149-3c35-4341-87c1-ce711b04c3b1-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp\" (UID: \"58784149-3c35-4341-87c1-ce711b04c3b1\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:37:23.158461 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:23.158425 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:37:23.286566 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:23.285948 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"]
Apr 16 14:37:23.287909 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:23.287888 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:37:24.026593 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:24.026546 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" event={"ID":"58784149-3c35-4341-87c1-ce711b04c3b1","Type":"ContainerStarted","Data":"ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e"}
Apr 16 14:37:24.026980 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:24.026604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" event={"ID":"58784149-3c35-4341-87c1-ce711b04c3b1","Type":"ContainerStarted","Data":"c298868e73df970d16c87579b4fc161b72648d6a74697fdd4cb6eab10e3a85c6"}
Apr 16 14:37:25.184227 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:25.184184 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.46:8080: connect: connection refused"
Apr 16 14:37:27.035981 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:27.035947 2574 generic.go:358] "Generic (PLEG): container finished" podID="58784149-3c35-4341-87c1-ce711b04c3b1" containerID="ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e" exitCode=0
Apr 16 14:37:27.035981 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:27.035976 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" event={"ID":"58784149-3c35-4341-87c1-ce711b04c3b1","Type":"ContainerDied","Data":"ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e"}
Apr 16 14:37:27.683047 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:27.683022 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"
Apr 16 14:37:27.798554 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:27.798458 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d07a4a-a61f-41fe-9a55-66aeebd58364-kserve-provision-location\") pod \"03d07a4a-a61f-41fe-9a55-66aeebd58364\" (UID: \"03d07a4a-a61f-41fe-9a55-66aeebd58364\") "
Apr 16 14:37:27.798822 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:27.798800 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03d07a4a-a61f-41fe-9a55-66aeebd58364-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "03d07a4a-a61f-41fe-9a55-66aeebd58364" (UID: "03d07a4a-a61f-41fe-9a55-66aeebd58364"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:37:27.899939 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:27.899899 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/03d07a4a-a61f-41fe-9a55-66aeebd58364-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:37:28.039915 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.039882 2574 generic.go:358] "Generic (PLEG): container finished" podID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerID="bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d" exitCode=0
Apr 16 14:37:28.040352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.039952 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"
Apr 16 14:37:28.040352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.039961 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" event={"ID":"03d07a4a-a61f-41fe-9a55-66aeebd58364","Type":"ContainerDied","Data":"bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d"}
Apr 16 14:37:28.040352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.039996 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25" event={"ID":"03d07a4a-a61f-41fe-9a55-66aeebd58364","Type":"ContainerDied","Data":"2d003e69de2a25970698c96ac6815a347f39a0a35cda2ab7c6800901c31d2363"}
Apr 16 14:37:28.040352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.040017 2574 scope.go:117] "RemoveContainer" containerID="bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d"
Apr 16 14:37:28.041656 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.041635 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" event={"ID":"58784149-3c35-4341-87c1-ce711b04c3b1","Type":"ContainerStarted","Data":"1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26"}
Apr 16 14:37:28.041864 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.041846 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:37:28.048018 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.047993 2574 scope.go:117] "RemoveContainer" containerID="ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36"
Apr 16 14:37:28.055010 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.054994 2574 scope.go:117] "RemoveContainer" containerID="bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d"
Apr 16 14:37:28.055287 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:37:28.055268 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d\": container with ID starting with bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d not found: ID does not exist" containerID="bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d"
Apr 16 14:37:28.055336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.055295 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d"} err="failed to get container status \"bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d\": rpc error: code = NotFound desc = could not find container \"bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d\": container with ID starting with bba23f59c153bbe1ae77d2dbe2c2039149e9e5dbcb4cedd4d3568b5eab02ef4d not found: ID does not exist"
Apr 16 14:37:28.055336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.055313 2574 scope.go:117] "RemoveContainer" containerID="ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36"
Apr 16 14:37:28.055545 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:37:28.055526 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36\": container with ID starting with ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36 not found: ID does not exist" containerID="ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36"
Apr 16 14:37:28.055614 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.055552 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36"} err="failed to get container status \"ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36\": rpc error: code = NotFound desc = could not find container \"ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36\": container with ID starting with ba3953bc4239e9a8022c381a56ade2be15a972ddf29e815d51e8132384398a36 not found: ID does not exist"
Apr 16 14:37:28.061006 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.060962 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" podStartSLOduration=6.060947639 podStartE2EDuration="6.060947639s" podCreationTimestamp="2026-04-16 14:37:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:37:28.058836724 +0000 UTC m=+2289.420379492" watchObservedRunningTime="2026-04-16 14:37:28.060947639 +0000 UTC m=+2289.422490408"
Apr 16 14:37:28.070642 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.070621 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"]
Apr 16 14:37:28.072442 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:28.072420 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4qg25"]
Apr 16 14:37:29.186885 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:29.186846 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" path="/var/lib/kubelet/pods/03d07a4a-a61f-41fe-9a55-66aeebd58364/volumes"
Apr 16 14:37:59.047675 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:37:59.047637 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 16 14:38:09.046378 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:09.046332 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 16 14:38:19.046273 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:19.046224 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 16 14:38:29.046183 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:29.046141 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 16 14:38:39.046853 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:39.046812 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.47:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.133.0.47:8080: connect: connection refused"
Apr 16 14:38:48.187158 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:48.187125 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:38:52.943799 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:52.943760 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"]
Apr 16 14:38:52.944267 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:52.944120 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container" containerID="cri-o://1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26" gracePeriod=30
Apr 16 14:38:53.020743 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.020711 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"]
Apr 16 14:38:53.021021 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.021007 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="storage-initializer"
Apr 16 14:38:53.021084 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.021022 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="storage-initializer"
Apr 16 14:38:53.021084 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.021034 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container"
Apr 16 14:38:53.021084 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.021041 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container"
Apr 16 14:38:53.021260 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.021090 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="03d07a4a-a61f-41fe-9a55-66aeebd58364" containerName="kserve-container"
Apr 16 14:38:53.024040 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.024018 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:38:53.031122 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.031093 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"]
Apr 16 14:38:53.176862 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.176824 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a183179-7ec7-4316-9f6f-69ed5bb59961-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9\" (UID: \"5a183179-7ec7-4316-9f6f-69ed5bb59961\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:38:53.277494 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.277386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a183179-7ec7-4316-9f6f-69ed5bb59961-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9\" (UID: \"5a183179-7ec7-4316-9f6f-69ed5bb59961\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:38:53.277860 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.277837 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a183179-7ec7-4316-9f6f-69ed5bb59961-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9\" (UID: \"5a183179-7ec7-4316-9f6f-69ed5bb59961\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:38:53.336065 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.336026 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:38:53.455032 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:53.454998 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"]
Apr 16 14:38:53.458064 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:38:53.458027 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a183179_7ec7_4316_9f6f_69ed5bb59961.slice/crio-48753f1c71fee0be37042b257ace1c96b600de038dd1686e1a26dbbe91eda054 WatchSource:0}: Error finding container 48753f1c71fee0be37042b257ace1c96b600de038dd1686e1a26dbbe91eda054: Status 404 returned error can't find the container with id 48753f1c71fee0be37042b257ace1c96b600de038dd1686e1a26dbbe91eda054
Apr 16 14:38:54.289083 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:54.289047 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" event={"ID":"5a183179-7ec7-4316-9f6f-69ed5bb59961","Type":"ContainerStarted","Data":"d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35"}
Apr 16 14:38:54.289083 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:54.289084 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" event={"ID":"5a183179-7ec7-4316-9f6f-69ed5bb59961","Type":"ContainerStarted","Data":"48753f1c71fee0be37042b257ace1c96b600de038dd1686e1a26dbbe91eda054"}
Apr 16 14:38:57.791877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:57.791853 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:38:57.918193 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:57.918092 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58784149-3c35-4341-87c1-ce711b04c3b1-kserve-provision-location\") pod \"58784149-3c35-4341-87c1-ce711b04c3b1\" (UID: \"58784149-3c35-4341-87c1-ce711b04c3b1\") "
Apr 16 14:38:57.918451 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:57.918424 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58784149-3c35-4341-87c1-ce711b04c3b1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "58784149-3c35-4341-87c1-ce711b04c3b1" (UID: "58784149-3c35-4341-87c1-ce711b04c3b1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:38:58.018918 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.018881 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58784149-3c35-4341-87c1-ce711b04c3b1-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:38:58.303205 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.303171 2574 generic.go:358] "Generic (PLEG): container finished" podID="58784149-3c35-4341-87c1-ce711b04c3b1" containerID="1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26" exitCode=0
Apr 16 14:38:58.303401 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.303245 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"
Apr 16 14:38:58.303401 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.303254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" event={"ID":"58784149-3c35-4341-87c1-ce711b04c3b1","Type":"ContainerDied","Data":"1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26"}
Apr 16 14:38:58.303401 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.303300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp" event={"ID":"58784149-3c35-4341-87c1-ce711b04c3b1","Type":"ContainerDied","Data":"c298868e73df970d16c87579b4fc161b72648d6a74697fdd4cb6eab10e3a85c6"}
Apr 16 14:38:58.303401 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.303323 2574 scope.go:117] "RemoveContainer" containerID="1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26"
Apr 16 14:38:58.304828 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.304811 2574 generic.go:358] "Generic (PLEG): container finished" podID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerID="d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35" exitCode=0
Apr 16 14:38:58.304905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.304848 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" event={"ID":"5a183179-7ec7-4316-9f6f-69ed5bb59961","Type":"ContainerDied","Data":"d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35"}
Apr 16 14:38:58.311837 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.311818 2574 scope.go:117] "RemoveContainer" containerID="ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e"
Apr 16 14:38:58.318743 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.318722 2574 scope.go:117] "RemoveContainer" containerID="1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26"
Apr 16 14:38:58.319004 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:38:58.318980 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26\": container with ID starting with 1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26 not found: ID does not exist" containerID="1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26"
Apr 16 14:38:58.319050 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.319015 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26"} err="failed to get container status \"1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26\": rpc error: code = NotFound desc = could not find container \"1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26\": container with ID starting with 1060a52029e2b5e97626e98facc3752bef1f7d43f625b1276cce3af313351a26 not found: ID does not exist"
Apr 16 14:38:58.319050 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.319038 2574 scope.go:117] "RemoveContainer" containerID="ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e"
Apr 16 14:38:58.319253 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:38:58.319235 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e\": container with ID starting with ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e not found: ID does not exist" containerID="ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e"
Apr 16 14:38:58.319294 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.319261 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e"} err="failed to get container status \"ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e\": rpc error: code = NotFound desc = could not find container \"ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e\": container with ID starting with ae8429e74640c8667cce0fc2de0a79b240ac4d45d5f401f606b5e805e14cf85e not found: ID does not exist"
Apr 16 14:38:58.343904 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.343880 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"]
Apr 16 14:38:58.347981 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:58.347956 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-bxzsp"]
Apr 16 14:38:59.188052 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:59.188023 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" path="/var/lib/kubelet/pods/58784149-3c35-4341-87c1-ce711b04c3b1/volumes"
Apr 16 14:38:59.309786 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:59.309751 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" event={"ID":"5a183179-7ec7-4316-9f6f-69ed5bb59961","Type":"ContainerStarted","Data":"aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe"}
Apr 16 14:38:59.310021 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:59.309987 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:38:59.332387 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:38:59.332339 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" podStartSLOduration=6.332325095 podStartE2EDuration="6.332325095s" podCreationTimestamp="2026-04-16 14:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:38:59.331242142 +0000 UTC m=+2380.692784909" watchObservedRunningTime="2026-04-16 14:38:59.332325095 +0000 UTC m=+2380.693867927"
Apr 16 14:39:30.315261 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:39:30.315219 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 14:39:40.315226 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:39:40.315135 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 14:39:42.195392 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:39:42.195364 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"
Apr 16 14:39:42.197542 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:39:42.197515 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"
Apr 16 14:39:50.314565 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:39:50.314518 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 14:40:00.315222 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:00.315174 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 14:40:10.314934 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:10.314884 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.48:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.133.0.48:8080: connect: connection refused"
Apr 16 14:40:20.318557 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:20.318522 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:40:23.146353 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.146319 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"]
Apr 16 14:40:23.146785 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.146564 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" containerID="cri-o://aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe" gracePeriod=30
Apr 16 14:40:23.233364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.233331 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"]
Apr 16 14:40:23.233670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.233657 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container"
Apr 16 14:40:23.233670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.233671 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container"
Apr 16 14:40:23.233754 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.233680 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="storage-initializer"
Apr 16 14:40:23.233754 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.233686 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="storage-initializer"
Apr 16 14:40:23.233754 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.233739 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="58784149-3c35-4341-87c1-ce711b04c3b1" containerName="kserve-container"
Apr 16 14:40:23.236574 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.236555 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"
Apr 16 14:40:23.244479 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.244450 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"]
Apr 16 14:40:23.274314 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.274269 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3484c28b-960a-4e3a-be84-e0270f6409fb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5\" (UID: \"3484c28b-960a-4e3a-be84-e0270f6409fb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"
Apr 16 14:40:23.375058 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.375007 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3484c28b-960a-4e3a-be84-e0270f6409fb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5\" (UID: \"3484c28b-960a-4e3a-be84-e0270f6409fb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"
Apr 16 14:40:23.375395 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.375372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3484c28b-960a-4e3a-be84-e0270f6409fb-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5\" (UID: \"3484c28b-960a-4e3a-be84-e0270f6409fb\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"
Apr 16 14:40:23.547617 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.547496 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"
Apr 16 14:40:23.665204 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:23.665169 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"]
Apr 16 14:40:23.668295 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:40:23.668263 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3484c28b_960a_4e3a_be84_e0270f6409fb.slice/crio-6fd3eb8c0bcf2d58dea5b2493d7af77f4b87699bbb249f4d3a1e912a59de444d WatchSource:0}: Error finding container 6fd3eb8c0bcf2d58dea5b2493d7af77f4b87699bbb249f4d3a1e912a59de444d: Status 404 returned error can't find the container with id 6fd3eb8c0bcf2d58dea5b2493d7af77f4b87699bbb249f4d3a1e912a59de444d
Apr 16 14:40:24.563474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:24.563438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" event={"ID":"3484c28b-960a-4e3a-be84-e0270f6409fb","Type":"ContainerStarted","Data":"053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577"}
Apr 16 14:40:24.563474 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:24.563477 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" event={"ID":"3484c28b-960a-4e3a-be84-e0270f6409fb","Type":"ContainerStarted","Data":"6fd3eb8c0bcf2d58dea5b2493d7af77f4b87699bbb249f4d3a1e912a59de444d"}
Apr 16 14:40:27.574251 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:27.574219 2574 generic.go:358] "Generic (PLEG): container finished" podID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerID="053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577" exitCode=0
Apr 16 14:40:27.574622 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:27.574258 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" event={"ID":"3484c28b-960a-4e3a-be84-e0270f6409fb","Type":"ContainerDied","Data":"053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577"}
Apr 16 14:40:27.884154 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:27.884129 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"
Apr 16 14:40:27.907836 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:27.907807 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a183179-7ec7-4316-9f6f-69ed5bb59961-kserve-provision-location\") pod \"5a183179-7ec7-4316-9f6f-69ed5bb59961\" (UID: \"5a183179-7ec7-4316-9f6f-69ed5bb59961\") "
Apr 16 14:40:27.908120 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:27.908094 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a183179-7ec7-4316-9f6f-69ed5bb59961-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5a183179-7ec7-4316-9f6f-69ed5bb59961" (UID: "5a183179-7ec7-4316-9f6f-69ed5bb59961"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:40:28.008669 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.008636 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5a183179-7ec7-4316-9f6f-69ed5bb59961-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:40:28.579318 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.579281 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" event={"ID":"3484c28b-960a-4e3a-be84-e0270f6409fb","Type":"ContainerStarted","Data":"fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e"} Apr 16 14:40:28.589175 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.579512 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" Apr 16 14:40:28.589175 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.580647 2574 generic.go:358] "Generic (PLEG): container finished" podID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerID="aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe" exitCode=0 Apr 16 14:40:28.589175 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.580694 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" Apr 16 14:40:28.589175 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.580707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" event={"ID":"5a183179-7ec7-4316-9f6f-69ed5bb59961","Type":"ContainerDied","Data":"aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe"} Apr 16 14:40:28.589175 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.580736 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9" event={"ID":"5a183179-7ec7-4316-9f6f-69ed5bb59961","Type":"ContainerDied","Data":"48753f1c71fee0be37042b257ace1c96b600de038dd1686e1a26dbbe91eda054"} Apr 16 14:40:28.589175 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.580753 2574 scope.go:117] "RemoveContainer" containerID="aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe" Apr 16 14:40:28.589175 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.588758 2574 scope.go:117] "RemoveContainer" containerID="d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35" Apr 16 14:40:28.595926 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.595778 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podStartSLOduration=5.595763861 podStartE2EDuration="5.595763861s" podCreationTimestamp="2026-04-16 14:40:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:40:28.594133026 +0000 UTC m=+2469.955675804" watchObservedRunningTime="2026-04-16 14:40:28.595763861 +0000 UTC m=+2469.957306629" Apr 16 14:40:28.596013 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.595935 2574 scope.go:117] "RemoveContainer" containerID="aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe" Apr 16 14:40:28.596220 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:40:28.596200 2574 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe\": container with ID starting with aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe not found: ID does not exist" containerID="aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe" Apr 16 14:40:28.596270 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.596228 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe"} err="failed to get container status \"aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe\": rpc error: code = NotFound desc = could not find container \"aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe\": container with ID starting with aaf1011f34e45ab05560bf437df358da3351418c9739415723ab6a1da1435cbe not found: ID does not exist" Apr 16 14:40:28.596270 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.596246 2574 scope.go:117] "RemoveContainer" containerID="d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35" Apr 16 14:40:28.596467 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:40:28.596436 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35\": container with ID starting with d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35 not found: ID does not exist" containerID="d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35" Apr 16 14:40:28.596808 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.596468 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35"} err="failed to get container status \"d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35\": rpc error: code = NotFound desc = could not find container \"d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35\": container with ID starting with d5b2abe4f9db86e5a86e521794a322cccebc09b5b484f116c7a57f7500770c35 not found: ID does not exist" Apr 16 14:40:28.604945 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.604926 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"] Apr 16 14:40:28.608654 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:28.608635 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-pf2q9"] Apr 16 14:40:29.187232 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:29.187198 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" path="/var/lib/kubelet/pods/5a183179-7ec7-4316-9f6f-69ed5bb59961/volumes" Apr 16 14:40:59.586165 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:40:59.586129 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 14:41:09.584995 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:09.584943 2574 
prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 14:41:19.593945 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:19.593837 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 14:41:29.584955 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:29.584914 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 14:41:39.585014 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:39.584970 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.49:8080: connect: connection refused" Apr 16 14:41:47.187952 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:47.187919 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" Apr 16 14:41:53.341722 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:53.341690 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"] Apr 16 14:41:53.342146 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:53.341923 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container" containerID="cri-o://fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e" gracePeriod=30 Apr 16 14:41:55.508257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.508226 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"] Apr 16 14:41:55.508650 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.508517 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" Apr 16 14:41:55.508650 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.508527 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container" Apr 16 14:41:55.508650 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.508544 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="storage-initializer" Apr 16 14:41:55.508650 
Apr 16 14:41:55.508650 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.508550 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="storage-initializer"
Apr 16 14:41:55.508650 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.508616 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5a183179-7ec7-4316-9f6f-69ed5bb59961" containerName="kserve-container"
Apr 16 14:41:55.511382 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.511364 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:41:55.519414 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.519387 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"]
Apr 16 14:41:55.605454 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.605419 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05595a0-b46c-41a5-927b-6770fae11077-kserve-provision-location\") pod \"isvc-sklearn-predictor-6bd4b4b959-w9z5m\" (UID: \"f05595a0-b46c-41a5-927b-6770fae11077\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:41:55.706290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.706252 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05595a0-b46c-41a5-927b-6770fae11077-kserve-provision-location\") pod \"isvc-sklearn-predictor-6bd4b4b959-w9z5m\" (UID: \"f05595a0-b46c-41a5-927b-6770fae11077\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:41:55.706676 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.706657 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05595a0-b46c-41a5-927b-6770fae11077-kserve-provision-location\") pod \"isvc-sklearn-predictor-6bd4b4b959-w9z5m\" (UID: \"f05595a0-b46c-41a5-927b-6770fae11077\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:41:55.822372 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.822339 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:41:55.943147 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:55.942967 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"]
Apr 16 14:41:55.945961 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:41:55.945924 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05595a0_b46c_41a5_927b_6770fae11077.slice/crio-329f54297b3d8441625f93e918a5d7da18a45499466a6ea0b9f6bd3126961ded WatchSource:0}: Error finding container 329f54297b3d8441625f93e918a5d7da18a45499466a6ea0b9f6bd3126961ded: Status 404 returned error can't find the container with id 329f54297b3d8441625f93e918a5d7da18a45499466a6ea0b9f6bd3126961ded
Apr 16 14:41:56.839364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:56.839319 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" event={"ID":"f05595a0-b46c-41a5-927b-6770fae11077","Type":"ContainerStarted","Data":"2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5"}
Apr 16 14:41:56.839364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:56.839367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" event={"ID":"f05595a0-b46c-41a5-927b-6770fae11077","Type":"ContainerStarted","Data":"329f54297b3d8441625f93e918a5d7da18a45499466a6ea0b9f6bd3126961ded"}
Apr 16 14:41:57.183904 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:57.183812 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.49:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.133.0.49:8080: connect: connection refused"
Apr 16 14:41:58.282218 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.282195 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"
Apr 16 14:41:58.331865 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.331826 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3484c28b-960a-4e3a-be84-e0270f6409fb-kserve-provision-location\") pod \"3484c28b-960a-4e3a-be84-e0270f6409fb\" (UID: \"3484c28b-960a-4e3a-be84-e0270f6409fb\") "
Apr 16 14:41:58.332204 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.332177 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3484c28b-960a-4e3a-be84-e0270f6409fb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3484c28b-960a-4e3a-be84-e0270f6409fb" (UID: "3484c28b-960a-4e3a-be84-e0270f6409fb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:41:58.432571 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.432488 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3484c28b-960a-4e3a-be84-e0270f6409fb-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:41:58.846443 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.846412 2574 generic.go:358] "Generic (PLEG): container finished" podID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerID="fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e" exitCode=0
Apr 16 14:41:58.846608 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.846450 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" event={"ID":"3484c28b-960a-4e3a-be84-e0270f6409fb","Type":"ContainerDied","Data":"fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e"}
Apr 16 14:41:58.846608 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.846474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5" event={"ID":"3484c28b-960a-4e3a-be84-e0270f6409fb","Type":"ContainerDied","Data":"6fd3eb8c0bcf2d58dea5b2493d7af77f4b87699bbb249f4d3a1e912a59de444d"}
Apr 16 14:41:58.846608 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.846480 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"
Apr 16 14:41:58.846608 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.846496 2574 scope.go:117] "RemoveContainer" containerID="fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e"
Apr 16 14:41:58.854456 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.854437 2574 scope.go:117] "RemoveContainer" containerID="053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577"
Apr 16 14:41:58.861435 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.861419 2574 scope.go:117] "RemoveContainer" containerID="fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e"
Apr 16 14:41:58.861687 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:41:58.861664 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e\": container with ID starting with fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e not found: ID does not exist" containerID="fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e"
Apr 16 14:41:58.861754 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.861695 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e"} err="failed to get container status \"fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e\": rpc error: code = NotFound desc = could not find container \"fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e\": container with ID starting with fbd71813badc4e7442f6c5cba2ae7fbb02f1e43ee9e2840aca6345d56370251e not found: ID does not exist"
Apr 16 14:41:58.861754 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.861713 2574 scope.go:117] "RemoveContainer" containerID="053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577"
Apr 16 14:41:58.861910 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:41:58.861893 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577\": container with ID starting with 053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577 not found: ID does not exist" containerID="053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577"
Apr 16 14:41:58.861950 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.861917 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577"} err="failed to get container status \"053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577\": rpc error: code = NotFound desc = could not find container \"053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577\": container with ID starting with 053d742208f9d799a9d30d682782d67522d6ab7751af4f5776824b826e5a8577 not found: ID does not exist"
Apr 16 14:41:58.867621 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.867598 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"]
Apr 16 14:41:58.870438 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:58.870412 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jrtc5"]
Apr 16 14:41:59.186624 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:59.186539 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" path="/var/lib/kubelet/pods/3484c28b-960a-4e3a-be84-e0270f6409fb/volumes"
Apr 16 14:41:59.850244 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:59.850211 2574 generic.go:358] "Generic (PLEG): container finished" podID="f05595a0-b46c-41a5-927b-6770fae11077" containerID="2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5" exitCode=0
Apr 16 14:41:59.850672 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:41:59.850285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" event={"ID":"f05595a0-b46c-41a5-927b-6770fae11077","Type":"ContainerDied","Data":"2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5"}
Apr 16 14:42:00.855388 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:00.855351 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" event={"ID":"f05595a0-b46c-41a5-927b-6770fae11077","Type":"ContainerStarted","Data":"acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9"}
Apr 16 14:42:00.855815 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:00.855623 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:42:00.856913 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:00.856889 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
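Each predictor above fails its readiness probe roughly every ten seconds while the model server loads, then flips to ready; the lightgbm predictor, for instance, runs from its first failure at 14:40:59 to ready at 14:41:47, about 48 seconds. A rough way to measure that warm-up window per pod, sketched under the assumption that the journal excerpt is fed in line by line (this is illustrative tooling, not part of the test suite):

import re
from datetime import datetime

# Journal timestamps carry no year, so parse month/day/time only.
TS = r"(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d{6})"
FAIL = re.compile(TS + r'.*"Probe failed".*pod="(?P<pod>[^"]+)"')
READY = re.compile(TS + r'.*status="ready" pod="(?P<pod>[^"]+)"')

def parse(ts: str) -> datetime:
    return datetime.strptime(ts, "%b %d %H:%M:%S.%f")

def flap_durations(lines):
    """Map pod -> time from first readiness failure to first ready."""
    first_fail, out = {}, {}
    for line in lines:
        if m := FAIL.search(line):
            first_fail.setdefault(m["pod"], parse(m["ts"]))
        elif (m := READY.search(line)) and m["pod"] in first_fail:
            out[m["pod"]] = parse(m["ts"]) - first_fail.pop(m["pod"])
    return out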
Apr 16 14:42:00.873903 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:00.873855 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podStartSLOduration=5.873842161 podStartE2EDuration="5.873842161s" podCreationTimestamp="2026-04-16 14:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:42:00.871951067 +0000 UTC m=+2562.233493849" watchObservedRunningTime="2026-04-16 14:42:00.873842161 +0000 UTC m=+2562.235384929"
Apr 16 14:42:01.858439 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:01.858405 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 14:42:11.858668 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:11.858575 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 14:42:21.858488 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:21.858443 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 14:42:31.858994 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:31.858949 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 14:42:41.859210 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:41.859121 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 14:42:51.858712 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:42:51.858667 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 14:43:01.858786 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:01.858739 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.50:8080: connect: connection refused"
Apr 16 14:43:11.860508 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:11.860474 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:43:15.626624 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.626565 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"]
Apr 16 14:43:15.627018 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.626836 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container" containerID="cri-o://acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9" gracePeriod=30
Apr 16 14:43:15.714278 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.714238 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"]
Apr 16 14:43:15.714567 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.714555 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container"
Apr 16 14:43:15.714634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.714569 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container"
Apr 16 14:43:15.714634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.714591 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="storage-initializer"
Apr 16 14:43:15.714634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.714597 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="storage-initializer"
Apr 16 14:43:15.714732 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.714654 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3484c28b-960a-4e3a-be84-e0270f6409fb" containerName="kserve-container"
Apr 16 14:43:15.717545 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.717528 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:43:15.734262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.734230 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"]
Apr 16 14:43:15.834847 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.834805 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa41135-3449-4a9f-a45e-2e247fbefbaa-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-vmdks\" (UID: \"ffa41135-3449-4a9f-a45e-2e247fbefbaa\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:43:15.936154 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.936059 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa41135-3449-4a9f-a45e-2e247fbefbaa-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-vmdks\" (UID: \"ffa41135-3449-4a9f-a45e-2e247fbefbaa\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:43:15.936453 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:15.936433 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa41135-3449-4a9f-a45e-2e247fbefbaa-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-vmdks\" (UID: \"ffa41135-3449-4a9f-a45e-2e247fbefbaa\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:43:16.027322 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:16.027273 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:43:16.150882 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:16.150855 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"]
Apr 16 14:43:16.153751 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:43:16.153721 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffa41135_3449_4a9f_a45e_2e247fbefbaa.slice/crio-5d261bc87f5bdcc7e242c97b9867de0a687c37a1e31c43705c07346857c2383d WatchSource:0}: Error finding container 5d261bc87f5bdcc7e242c97b9867de0a687c37a1e31c43705c07346857c2383d: Status 404 returned error can't find the container with id 5d261bc87f5bdcc7e242c97b9867de0a687c37a1e31c43705c07346857c2383d
Apr 16 14:43:16.155485 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:16.155467 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:43:17.080188 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:17.080147 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" event={"ID":"ffa41135-3449-4a9f-a45e-2e247fbefbaa","Type":"ContainerStarted","Data":"ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5"}
Apr 16 14:43:17.080188 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:17.080183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" event={"ID":"ffa41135-3449-4a9f-a45e-2e247fbefbaa","Type":"ContainerStarted","Data":"5d261bc87f5bdcc7e242c97b9867de0a687c37a1e31c43705c07346857c2383d"}
Apr 16 14:43:20.059231 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.059208 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:43:20.092057 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.092025 2574 generic.go:358] "Generic (PLEG): container finished" podID="f05595a0-b46c-41a5-927b-6770fae11077" containerID="acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9" exitCode=0
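The cAdvisor warning from manager.go:1169 above ("Failed to process watch event ... Status 404") fires when a crio-<id> cgroup appears before cAdvisor can inspect the container; in this excerpt every such ID is followed by a ContainerStarted PLEG event within about a second, so the warnings are startup races rather than lost containers. A hypothetical check for that pattern, assuming the journal is read line by line (not a vendored tool):

import re

CRIO_ID = re.compile(r"crio-(?P<cid>[0-9a-f]{64})")
STARTED = re.compile(r'"Type":"ContainerStarted","Data":"(?P<cid>[0-9a-f]{64})"')

def unexplained_watch_404s(lines) -> set:
    """Return watch-404 container IDs that never report ContainerStarted."""
    warned, started = set(), set()
    for line in lines:
        if "Failed to process watch event" in line:
            if m := CRIO_ID.search(line):
                warned.add(m["cid"])
        elif m := STARTED.search(line):
            started.add(m["cid"])
    return warned - started  # empty set == every 404 resolved itself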
Apr 16 14:43:20.092259 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.092091 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"
Apr 16 14:43:20.092259 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.092113 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" event={"ID":"f05595a0-b46c-41a5-927b-6770fae11077","Type":"ContainerDied","Data":"acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9"}
Apr 16 14:43:20.092259 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.092154 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m" event={"ID":"f05595a0-b46c-41a5-927b-6770fae11077","Type":"ContainerDied","Data":"329f54297b3d8441625f93e918a5d7da18a45499466a6ea0b9f6bd3126961ded"}
Apr 16 14:43:20.092259 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.092176 2574 scope.go:117] "RemoveContainer" containerID="acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9"
Apr 16 14:43:20.093595 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.093560 2574 generic.go:358] "Generic (PLEG): container finished" podID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerID="ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5" exitCode=0
Apr 16 14:43:20.093709 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.093634 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" event={"ID":"ffa41135-3449-4a9f-a45e-2e247fbefbaa","Type":"ContainerDied","Data":"ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5"}
Apr 16 14:43:20.100292 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.100274 2574 scope.go:117] "RemoveContainer" containerID="2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5"
Apr 16 14:43:20.107663 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.107642 2574 scope.go:117] "RemoveContainer" containerID="acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9"
Apr 16 14:43:20.107967 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:43:20.107939 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9\": container with ID starting with acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9 not found: ID does not exist" containerID="acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9"
Apr 16 14:43:20.108060 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.107977 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9"} err="failed to get container status \"acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9\": rpc error: code = NotFound desc = could not find container \"acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9\": container with ID starting with acb11f4ecf37dbd24d0ecd140824271d36ac09381a7707b4c309360d010c14b9 not found: ID does not exist"
Apr 16 14:43:20.108060 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.108003 2574 scope.go:117] "RemoveContainer" containerID="2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5"
Apr 16 14:43:20.108258 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:43:20.108240 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5\": container with ID starting with 2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5 not found: ID does not exist" containerID="2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5"
Apr 16 14:43:20.108300 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.108268 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5"} err="failed to get container status \"2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5\": rpc error: code = NotFound desc = could not find container \"2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5\": container with ID starting with 2189573cf2d472b98eb1a37bdc6160e0cd02b16990d43b432770d740059840f5 not found: ID does not exist"
Apr 16 14:43:20.174885 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.174850 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05595a0-b46c-41a5-927b-6770fae11077-kserve-provision-location\") pod \"f05595a0-b46c-41a5-927b-6770fae11077\" (UID: \"f05595a0-b46c-41a5-927b-6770fae11077\") "
Apr 16 14:43:20.175201 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.175177 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05595a0-b46c-41a5-927b-6770fae11077-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f05595a0-b46c-41a5-927b-6770fae11077" (UID: "f05595a0-b46c-41a5-927b-6770fae11077"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:43:20.275685 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.275644 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f05595a0-b46c-41a5-927b-6770fae11077-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:43:20.412872 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.412841 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"]
Apr 16 14:43:20.417173 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:20.417146 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-6bd4b4b959-w9z5m"]
Apr 16 14:43:21.098059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:21.098025 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" event={"ID":"ffa41135-3449-4a9f-a45e-2e247fbefbaa","Type":"ContainerStarted","Data":"60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e"}
Apr 16 14:43:21.098494 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:21.098257 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:43:21.118564 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:21.118521 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" podStartSLOduration=6.118507041 podStartE2EDuration="6.118507041s" podCreationTimestamp="2026-04-16 14:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:43:21.116151269 +0000 UTC m=+2642.477694036" watchObservedRunningTime="2026-04-16 14:43:21.118507041 +0000 UTC m=+2642.480049809"
Apr 16 14:43:21.187154 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:21.187122 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05595a0-b46c-41a5-927b-6770fae11077" path="/var/lib/kubelet/pods/f05595a0-b46c-41a5-927b-6770fae11077/volumes"
Apr 16 14:43:52.131511 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:43:52.131456 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 16 14:44:02.104482 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:02.104440 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:44:05.811950 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.811914 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"]
Apr 16 14:44:05.812415 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.812268 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="kserve-container" containerID="cri-o://60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e" gracePeriod=30
Apr 16 14:44:05.875339 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.875307 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"]
Apr 16 14:44:05.875622 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.875610 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="storage-initializer"
Apr 16 14:44:05.875684 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.875623 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="storage-initializer"
Apr 16 14:44:05.875684 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.875642 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container"
Apr 16 14:44:05.875684 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.875648 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container"
Apr 16 14:44:05.875789 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.875698 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="f05595a0-b46c-41a5-927b-6770fae11077" containerName="kserve-container"
Apr 16 14:44:05.878522 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.878506 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:05.886439 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.886413 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"]
Apr 16 14:44:05.938593 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:05.938540 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72ac5866-e41c-4f5d-b7b7-e57e91cb8627-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-858dfdc885-hg95b\" (UID: \"72ac5866-e41c-4f5d-b7b7-e57e91cb8627\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:06.039704 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:06.039666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72ac5866-e41c-4f5d-b7b7-e57e91cb8627-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-858dfdc885-hg95b\" (UID: \"72ac5866-e41c-4f5d-b7b7-e57e91cb8627\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:06.040093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:06.040072 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72ac5866-e41c-4f5d-b7b7-e57e91cb8627-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-858dfdc885-hg95b\" (UID: \"72ac5866-e41c-4f5d-b7b7-e57e91cb8627\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
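Each kserve-provision-location emptyDir above walks the same lifecycle: operationExecutor.VerifyControllerAttachedVolume, MountVolume started, MountVolume.SetUp succeeded, then on pod deletion UnmountVolume started, UnmountVolume.TearDown succeeded, and finally Volume detached. A small ordering check over those literal messages (the phase strings are from the log; the checker itself is an illustrative assumption, not kubelet code):

import re

PHASES = [
    "VerifyControllerAttachedVolume started",
    "MountVolume started",
    "MountVolume.SetUp succeeded",
    "UnmountVolume started",
    "UnmountVolume.TearDown succeeded",
    "Volume detached",
]
# The 36-char pod UID sits between "empty-dir/" and the volume name.
UID = re.compile(r"empty-dir/(?P<uid>[0-9a-f-]{36})-kserve-provision-location")

def out_of_order(lines):
    """Yield (pod UID, phase) whenever a volume phase regresses."""
    last = {}  # pod UID -> index of last phase seen
    for line in lines:
        m = UID.search(line)
        if not m:
            continue
        for i, phase in enumerate(PHASES):
            if phase in line:
                if i < last.get(m["uid"], -1):
                    yield m["uid"], phase
                last[m["uid"]] = i

For this excerpt it yields nothing: every pod UID moves through the phases in order.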
Apr 16 14:44:06.189093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:06.189060 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:06.307985 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:06.307961 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"]
Apr 16 14:44:06.310406 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:44:06.310375 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72ac5866_e41c_4f5d_b7b7_e57e91cb8627.slice/crio-a7448a5f7d2932443a738fb51fc1f399cd3569c8af6e7995d8608f0074e65b3c WatchSource:0}: Error finding container a7448a5f7d2932443a738fb51fc1f399cd3569c8af6e7995d8608f0074e65b3c: Status 404 returned error can't find the container with id a7448a5f7d2932443a738fb51fc1f399cd3569c8af6e7995d8608f0074e65b3c
Apr 16 14:44:07.233486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:07.233402 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" event={"ID":"72ac5866-e41c-4f5d-b7b7-e57e91cb8627","Type":"ContainerStarted","Data":"0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b"}
Apr 16 14:44:07.233486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:07.233436 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" event={"ID":"72ac5866-e41c-4f5d-b7b7-e57e91cb8627","Type":"ContainerStarted","Data":"a7448a5f7d2932443a738fb51fc1f399cd3569c8af6e7995d8608f0074e65b3c"}
Apr 16 14:44:12.102235 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:12.102193 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.51:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.133.0.51:8080: connect: connection refused"
Apr 16 14:44:12.249883 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:12.249851 2574 generic.go:358] "Generic (PLEG): container finished" podID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerID="0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b" exitCode=0
Apr 16 14:44:12.250052 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:12.249926 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" event={"ID":"72ac5866-e41c-4f5d-b7b7-e57e91cb8627","Type":"ContainerDied","Data":"0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b"}
Apr 16 14:44:13.254944 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.254912 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" event={"ID":"72ac5866-e41c-4f5d-b7b7-e57e91cb8627","Type":"ContainerStarted","Data":"cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302"}
Apr 16 14:44:13.255345 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.255210 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:13.256670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.256644 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused"
Apr 16 14:44:13.271807 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.271708 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" podStartSLOduration=8.271687965 podStartE2EDuration="8.271687965s" podCreationTimestamp="2026-04-16 14:44:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:44:13.271191799 +0000 UTC m=+2694.632734570" watchObservedRunningTime="2026-04-16 14:44:13.271687965 +0000 UTC m=+2694.633230735"
Apr 16 14:44:13.571782 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.571760 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:44:13.598513 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.598483 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa41135-3449-4a9f-a45e-2e247fbefbaa-kserve-provision-location\") pod \"ffa41135-3449-4a9f-a45e-2e247fbefbaa\" (UID: \"ffa41135-3449-4a9f-a45e-2e247fbefbaa\") "
Apr 16 14:44:13.598798 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.598776 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffa41135-3449-4a9f-a45e-2e247fbefbaa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ffa41135-3449-4a9f-a45e-2e247fbefbaa" (UID: "ffa41135-3449-4a9f-a45e-2e247fbefbaa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:44:13.699899 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:13.699864 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ffa41135-3449-4a9f-a45e-2e247fbefbaa-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:44:14.259905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.259862 2574 generic.go:358] "Generic (PLEG): container finished" podID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerID="60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e" exitCode=0
Apr 16 14:44:14.260375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.259933 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"
Apr 16 14:44:14.260375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.259939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" event={"ID":"ffa41135-3449-4a9f-a45e-2e247fbefbaa","Type":"ContainerDied","Data":"60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e"}
Apr 16 14:44:14.260375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.259977 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks" event={"ID":"ffa41135-3449-4a9f-a45e-2e247fbefbaa","Type":"ContainerDied","Data":"5d261bc87f5bdcc7e242c97b9867de0a687c37a1e31c43705c07346857c2383d"}
Apr 16 14:44:14.260375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.259991 2574 scope.go:117] "RemoveContainer" containerID="60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e"
Apr 16 14:44:14.260637 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.260511 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused"
Apr 16 14:44:14.269639 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.269618 2574 scope.go:117] "RemoveContainer" containerID="ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5"
Apr 16 14:44:14.276656 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.276634 2574 scope.go:117] "RemoveContainer" containerID="60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e"
Apr 16 14:44:14.276938 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:44:14.276914 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e\": container with ID starting with 60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e not found: ID does not exist" containerID="60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e"
Apr 16 14:44:14.277010 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.276947 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e"} err="failed to get container status \"60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e\": rpc error: code = NotFound desc = could not find container \"60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e\": container with ID starting with 60b166ade398dac6b4eb7e847c55896232447f5ae1ec3415758c53de9789b67e not found: ID does not exist"
Apr 16 14:44:14.277010 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.276966 2574 scope.go:117] "RemoveContainer" containerID="ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5"
Apr 16 14:44:14.277211 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:44:14.277196 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5\": container with ID starting with ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5 not found: ID does not exist" containerID="ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5"
Apr 16 14:44:14.277253 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.277217 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5"} err="failed to get container status \"ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5\": rpc error: code = NotFound desc = could not find container \"ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5\": container with ID starting with ddab31301ce73721a7ad7067da2b8c76e75cf9bb75d8584e1062a214482814c5 not found: ID does not exist"
Apr 16 14:44:14.281852 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.281827 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"]
Apr 16 14:44:14.286432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:14.286406 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-vmdks"]
Apr 16 14:44:15.187705 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:15.187674 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" path="/var/lib/kubelet/pods/ffa41135-3449-4a9f-a45e-2e247fbefbaa/volumes"
Apr 16 14:44:24.261113 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:24.261068 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.52:8080: connect: connection refused"
Apr 16 14:44:34.262176 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:34.262145 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:42.218327 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:42.218290 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"
Apr 16 14:44:42.220969 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:42.220944 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"
Apr 16 14:44:42.868246 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:42.868217 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-858dfdc885-hg95b_72ac5866-e41c-4f5d-b7b7-e57e91cb8627/kserve-container/0.log"
Apr 16 14:44:43.002848 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.002807 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"]
Apr 16 14:44:43.003219 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.003177 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="kserve-container" containerID="cri-o://cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302" gracePeriod=30
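pod_startup_latency_tracker reports a podStartSLOduration for each predictor in this run (5.87s, 6.12s and 8.27s so far, with a fourth below); no image pulls happen here, so firstStartedPulling and lastFinishedPulling stay at the zero time. A quick extraction sketch, illustrative only, for pulling those figures out of the journal text:

import re

SLO = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)"'
    r" podStartSLOduration=(?P<secs>[0-9.]+)")

def startup_durations(journal_text: str) -> dict:
    """Map pod name -> reported startup SLO duration in seconds."""
    return {m["pod"]: float(m["secs"]) for m in SLO.finditer(journal_text)}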
Apr 16 14:44:43.070796 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.070764 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"]
Apr 16 14:44:43.071081 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.071070 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="kserve-container"
Apr 16 14:44:43.071135 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.071083 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="kserve-container"
Apr 16 14:44:43.071135 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.071095 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="storage-initializer"
Apr 16 14:44:43.071135 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.071101 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="storage-initializer"
Apr 16 14:44:43.071263 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.071148 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ffa41135-3449-4a9f-a45e-2e247fbefbaa" containerName="kserve-container"
Apr 16 14:44:43.074135 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.074119 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"
Apr 16 14:44:43.082347 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.082324 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"]
Apr 16 14:44:43.160509 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.160421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4be73c5b-6864-4eaf-a1c8-c835652d62ec-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8\" (UID: \"4be73c5b-6864-4eaf-a1c8-c835652d62ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"
Apr 16 14:44:43.261000 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.260959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4be73c5b-6864-4eaf-a1c8-c835652d62ec-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8\" (UID: \"4be73c5b-6864-4eaf-a1c8-c835652d62ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"
Apr 16 14:44:43.261383 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.261332 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4be73c5b-6864-4eaf-a1c8-c835652d62ec-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8\" (UID: \"4be73c5b-6864-4eaf-a1c8-c835652d62ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"
Apr 16 14:44:43.384830 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.384799 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"
Apr 16 14:44:43.508059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:43.508032 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"]
Apr 16 14:44:43.510634 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:44:43.510561 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be73c5b_6864_4eaf_a1c8_c835652d62ec.slice/crio-d01532b7fc3329cbf63e86703b912d9a9fc16f6c884e11063907f3eff71d9f75 WatchSource:0}: Error finding container d01532b7fc3329cbf63e86703b912d9a9fc16f6c884e11063907f3eff71d9f75: Status 404 returned error can't find the container with id d01532b7fc3329cbf63e86703b912d9a9fc16f6c884e11063907f3eff71d9f75
Apr 16 14:44:44.357775 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:44.357736 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" event={"ID":"4be73c5b-6864-4eaf-a1c8-c835652d62ec","Type":"ContainerStarted","Data":"62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e"}
Apr 16 14:44:44.357775 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:44.357775 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" event={"ID":"4be73c5b-6864-4eaf-a1c8-c835652d62ec","Type":"ContainerStarted","Data":"d01532b7fc3329cbf63e86703b912d9a9fc16f6c884e11063907f3eff71d9f75"}
Apr 16 14:44:47.249832 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.249807 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:47.372045 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.371952 2574 generic.go:358] "Generic (PLEG): container finished" podID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerID="62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e" exitCode=0
Apr 16 14:44:47.372045 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.372002 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" event={"ID":"4be73c5b-6864-4eaf-a1c8-c835652d62ec","Type":"ContainerDied","Data":"62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e"}
Apr 16 14:44:47.373634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.373605 2574 generic.go:358] "Generic (PLEG): container finished" podID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerID="cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302" exitCode=0
Apr 16 14:44:47.373749 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.373657 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"
Apr 16 14:44:47.373749 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.373682 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" event={"ID":"72ac5866-e41c-4f5d-b7b7-e57e91cb8627","Type":"ContainerDied","Data":"cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302"}
Apr 16 14:44:47.373749 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.373716 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b" event={"ID":"72ac5866-e41c-4f5d-b7b7-e57e91cb8627","Type":"ContainerDied","Data":"a7448a5f7d2932443a738fb51fc1f399cd3569c8af6e7995d8608f0074e65b3c"}
Apr 16 14:44:47.373749 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.373731 2574 scope.go:117] "RemoveContainer" containerID="cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302"
Apr 16 14:44:47.381764 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.381747 2574 scope.go:117] "RemoveContainer" containerID="0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b"
Apr 16 14:44:47.389295 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.389273 2574 scope.go:117] "RemoveContainer" containerID="cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302"
Apr 16 14:44:47.389610 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:44:47.389573 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302\": container with ID starting with cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302 not found: ID does not exist" containerID="cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302"
Apr 16 14:44:47.389666 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.389624 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302"} err="failed to get container status \"cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302\": rpc error: code = NotFound desc = could not find container \"cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302\": container with ID starting with cf0bdcf57a8f0fdb9e18c7abf6dd8580ce11355f5cf901068b15cf66c6005302 not found: ID does not exist"
Apr 16 14:44:47.389666 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.389650 2574 scope.go:117] "RemoveContainer" containerID="0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b"
Apr 16 14:44:47.392763 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:44:47.392735 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b\": container with ID starting with 0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b not found: ID does not exist" containerID="0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b"
Apr 16 14:44:47.392931 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.392772 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b"} err="failed to get container status \"0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b\": rpc error: code = NotFound desc = could not find container \"0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b\": container with ID starting with 0c1cafdf47f8ddf29cc2685728d2f06909034791ff269a3726db3ef7b8b5f30b not found: ID does not exist"
Apr 16 14:44:47.394367 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.394348 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72ac5866-e41c-4f5d-b7b7-e57e91cb8627-kserve-provision-location\") pod \"72ac5866-e41c-4f5d-b7b7-e57e91cb8627\" (UID: \"72ac5866-e41c-4f5d-b7b7-e57e91cb8627\") "
Apr 16 14:44:47.421841 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.421805 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72ac5866-e41c-4f5d-b7b7-e57e91cb8627-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72ac5866-e41c-4f5d-b7b7-e57e91cb8627" (UID: "72ac5866-e41c-4f5d-b7b7-e57e91cb8627"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:44:47.495177 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.495144 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72ac5866-e41c-4f5d-b7b7-e57e91cb8627-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 14:44:47.699241 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.699207 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"]
Apr 16 14:44:47.705129 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:47.705101 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-858dfdc885-hg95b"]
Apr 16 14:44:48.378847 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:48.378807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" event={"ID":"4be73c5b-6864-4eaf-a1c8-c835652d62ec","Type":"ContainerStarted","Data":"be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd"}
Apr 16 14:44:48.379299 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:48.379047 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"
Apr 16 14:44:48.398346 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:48.398299 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" podStartSLOduration=5.398284547 podStartE2EDuration="5.398284547s" podCreationTimestamp="2026-04-16 14:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:44:48.396075857 +0000 UTC m=+2729.757618625" watchObservedRunningTime="2026-04-16 14:44:48.398284547 +0000 UTC m=+2729.759827395"
Apr 16 14:44:49.187560 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:44:49.187521 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" path="/var/lib/kubelet/pods/72ac5866-e41c-4f5d-b7b7-e57e91cb8627/volumes"
Apr 16 14:45:19.431365 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:19.431317 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400"
Apr 16 14:45:29.385161 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:29.385128 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"
Apr 16 14:45:33.152214 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.152171 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"]
Apr 16 14:45:33.152707 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.152473 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="kserve-container" containerID="cri-o://be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd" gracePeriod=30
Apr 16 14:45:33.233644 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.233611 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54"]
Apr 16 14:45:33.233994 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.233981 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="storage-initializer"
Apr 16 14:45:33.234039 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.233997 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="storage-initializer"
Apr 16 14:45:33.234039 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.234020 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="kserve-container"
Apr 16 14:45:33.234039 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.234029 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="kserve-container"
Apr 16 14:45:33.234137 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.234083 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="72ac5866-e41c-4f5d-b7b7-e57e91cb8627" containerName="kserve-container"
Apr 16 14:45:33.237109 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.237089 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:45:33.246634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.246611 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54"] Apr 16 14:45:33.367523 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.367478 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69bd7774-1028-468c-893e-de2d1e5c03c1-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-778988494c-dtx54\" (UID: \"69bd7774-1028-468c-893e-de2d1e5c03c1\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:45:33.468794 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.468705 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69bd7774-1028-468c-893e-de2d1e5c03c1-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-778988494c-dtx54\" (UID: \"69bd7774-1028-468c-893e-de2d1e5c03c1\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:45:33.469116 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.469096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69bd7774-1028-468c-893e-de2d1e5c03c1-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-778988494c-dtx54\" (UID: \"69bd7774-1028-468c-893e-de2d1e5c03c1\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:45:33.546944 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.546908 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:45:33.666126 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:33.666100 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54"] Apr 16 14:45:33.668724 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:45:33.668697 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69bd7774_1028_468c_893e_de2d1e5c03c1.slice/crio-8b3d08865855b1ab0019cd54e233af87fcea84617fcff1074d80813092988097 WatchSource:0}: Error finding container 8b3d08865855b1ab0019cd54e233af87fcea84617fcff1074d80813092988097: Status 404 returned error can't find the container with id 8b3d08865855b1ab0019cd54e233af87fcea84617fcff1074d80813092988097 Apr 16 14:45:34.518104 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:34.518064 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" event={"ID":"69bd7774-1028-468c-893e-de2d1e5c03c1","Type":"ContainerStarted","Data":"b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5"} Apr 16 14:45:34.518104 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:34.518103 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" event={"ID":"69bd7774-1028-468c-893e-de2d1e5c03c1","Type":"ContainerStarted","Data":"8b3d08865855b1ab0019cd54e233af87fcea84617fcff1074d80813092988097"} Apr 16 14:45:37.528667 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:37.528571 2574 generic.go:358] "Generic (PLEG): container finished" podID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerID="b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5" exitCode=0 Apr 16 14:45:37.528667 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:37.528640 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" event={"ID":"69bd7774-1028-468c-893e-de2d1e5c03c1","Type":"ContainerDied","Data":"b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5"} Apr 16 14:45:38.534648 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:38.534552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" event={"ID":"69bd7774-1028-468c-893e-de2d1e5c03c1","Type":"ContainerStarted","Data":"d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e"} Apr 16 14:45:38.535062 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:38.534916 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:45:38.536191 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:38.536164 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:45:38.551436 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:38.551386 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podStartSLOduration=5.551371181 podStartE2EDuration="5.551371181s" podCreationTimestamp="2026-04-16 14:45:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:45:38.550003112 +0000 UTC m=+2779.911545913" watchObservedRunningTime="2026-04-16 14:45:38.551371181 +0000 UTC m=+2779.912913948" Apr 16 14:45:39.383369 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:39.383321 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.53:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.133.0.53:8080: connect: connection refused" Apr 16 14:45:39.538182 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:39.538137 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:45:40.786482 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:40.786458 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" Apr 16 14:45:40.935053 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:40.934957 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4be73c5b-6864-4eaf-a1c8-c835652d62ec-kserve-provision-location\") pod \"4be73c5b-6864-4eaf-a1c8-c835652d62ec\" (UID: \"4be73c5b-6864-4eaf-a1c8-c835652d62ec\") " Apr 16 14:45:40.935257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:40.935232 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4be73c5b-6864-4eaf-a1c8-c835652d62ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4be73c5b-6864-4eaf-a1c8-c835652d62ec" (UID: "4be73c5b-6864-4eaf-a1c8-c835652d62ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:45:41.035723 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.035688 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4be73c5b-6864-4eaf-a1c8-c835652d62ec-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:45:41.544282 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.544197 2574 generic.go:358] "Generic (PLEG): container finished" podID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerID="be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd" exitCode=0 Apr 16 14:45:41.544432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.544276 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" event={"ID":"4be73c5b-6864-4eaf-a1c8-c835652d62ec","Type":"ContainerDied","Data":"be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd"} Apr 16 14:45:41.544432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.544290 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" Apr 16 14:45:41.544432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.544317 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8" event={"ID":"4be73c5b-6864-4eaf-a1c8-c835652d62ec","Type":"ContainerDied","Data":"d01532b7fc3329cbf63e86703b912d9a9fc16f6c884e11063907f3eff71d9f75"} Apr 16 14:45:41.544432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.544343 2574 scope.go:117] "RemoveContainer" containerID="be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd" Apr 16 14:45:41.552298 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.552280 2574 scope.go:117] "RemoveContainer" containerID="62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e" Apr 16 14:45:41.559491 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.559470 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"] Apr 16 14:45:41.560383 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.560361 2574 scope.go:117] "RemoveContainer" containerID="be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd" Apr 16 14:45:41.560673 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:45:41.560652 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd\": container with ID starting with be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd not found: ID does not exist" containerID="be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd" Apr 16 14:45:41.560771 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.560680 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd"} err="failed to get container status \"be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd\": rpc error: code = NotFound desc = could not find container \"be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd\": container with ID starting with be9bf89de9816547afddcc8d686799c016dd932d5f5bcedbe29b757ad71079fd not found: ID does not exist" Apr 16 14:45:41.560771 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.560699 2574 scope.go:117] "RemoveContainer" containerID="62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e" Apr 16 14:45:41.560929 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:45:41.560911 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e\": container with ID starting with 62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e not found: ID does not exist" containerID="62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e" Apr 16 14:45:41.560970 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.560936 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e"} err="failed to get container status \"62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e\": rpc error: code = NotFound desc = could not find container \"62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e\": container with ID 
starting with 62f735ec7710253e66a63c6b943484421bddce6537601382b11e8340ac9a557e not found: ID does not exist" Apr 16 14:45:41.561167 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:41.561152 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-x9qn8"] Apr 16 14:45:43.187155 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:43.187124 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" path="/var/lib/kubelet/pods/4be73c5b-6864-4eaf-a1c8-c835652d62ec/volumes" Apr 16 14:45:49.539067 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:49.539022 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:45:59.538647 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:45:59.538607 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:46:09.539019 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:09.538973 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:46:19.538259 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:19.538209 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:46:29.538875 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:29.538835 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:46:39.538477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:39.538433 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.54:8080: connect: connection refused" Apr 16 14:46:49.539610 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:49.539566 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:46:53.542532 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.542493 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj"] Apr 16 14:46:53.543010 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.542799 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="storage-initializer" Apr 16 14:46:53.543010 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:46:53.542811 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="storage-initializer" Apr 16 14:46:53.543010 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.542826 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="kserve-container" Apr 16 14:46:53.543010 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.542831 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="kserve-container" Apr 16 14:46:53.543010 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.542892 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4be73c5b-6864-4eaf-a1c8-c835652d62ec" containerName="kserve-container" Apr 16 14:46:53.545900 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.545879 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:46:53.554547 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.554522 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj"] Apr 16 14:46:53.589699 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.589664 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54"] Apr 16 14:46:53.589958 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.589919 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" containerID="cri-o://d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e" gracePeriod=30 Apr 16 14:46:53.723740 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.723698 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj\" (UID: \"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:46:53.824355 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.824316 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj\" (UID: \"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:46:53.824727 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.824708 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj\" (UID: \"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:46:53.856167 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.856132 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:46:53.977466 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:53.977434 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj"] Apr 16 14:46:53.980492 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:46:53.980465 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0d29c38_4e0b_4ae7_89bc_784cd8b8ea3c.slice/crio-b23f9fd29c4d4a1157185d5a26c9de77d662fe93b10cba94ad170165f4886cff WatchSource:0}: Error finding container b23f9fd29c4d4a1157185d5a26c9de77d662fe93b10cba94ad170165f4886cff: Status 404 returned error can't find the container with id b23f9fd29c4d4a1157185d5a26c9de77d662fe93b10cba94ad170165f4886cff Apr 16 14:46:54.760743 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:54.760707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" event={"ID":"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c","Type":"ContainerStarted","Data":"5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab"} Apr 16 14:46:54.760743 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:54.760745 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" event={"ID":"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c","Type":"ContainerStarted","Data":"b23f9fd29c4d4a1157185d5a26c9de77d662fe93b10cba94ad170165f4886cff"} Apr 16 14:46:58.035897 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.035874 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:46:58.059135 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.059057 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69bd7774-1028-468c-893e-de2d1e5c03c1-kserve-provision-location\") pod \"69bd7774-1028-468c-893e-de2d1e5c03c1\" (UID: \"69bd7774-1028-468c-893e-de2d1e5c03c1\") " Apr 16 14:46:58.059385 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.059364 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69bd7774-1028-468c-893e-de2d1e5c03c1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "69bd7774-1028-468c-893e-de2d1e5c03c1" (UID: "69bd7774-1028-468c-893e-de2d1e5c03c1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:46:58.160495 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.160454 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/69bd7774-1028-468c-893e-de2d1e5c03c1-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:46:58.774056 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.774024 2574 generic.go:358] "Generic (PLEG): container finished" podID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerID="d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e" exitCode=0 Apr 16 14:46:58.774260 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.774082 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" event={"ID":"69bd7774-1028-468c-893e-de2d1e5c03c1","Type":"ContainerDied","Data":"d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e"} Apr 16 14:46:58.774260 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.774101 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" Apr 16 14:46:58.774260 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.774125 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54" event={"ID":"69bd7774-1028-468c-893e-de2d1e5c03c1","Type":"ContainerDied","Data":"8b3d08865855b1ab0019cd54e233af87fcea84617fcff1074d80813092988097"} Apr 16 14:46:58.774260 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.774146 2574 scope.go:117] "RemoveContainer" containerID="d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e" Apr 16 14:46:58.775570 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.775552 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerID="5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab" exitCode=0 Apr 16 14:46:58.775678 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.775616 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" event={"ID":"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c","Type":"ContainerDied","Data":"5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab"} Apr 16 14:46:58.782181 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.782153 2574 scope.go:117] "RemoveContainer" containerID="b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5" Apr 16 14:46:58.789548 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.789530 2574 scope.go:117] "RemoveContainer" containerID="d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e" Apr 16 14:46:58.789840 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:46:58.789822 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e\": container with ID starting with d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e not found: ID does not exist" containerID="d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e" Apr 16 14:46:58.789888 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.789850 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e"} err="failed to get container status \"d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e\": rpc error: code = NotFound desc = could not find container \"d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e\": container with ID starting with d2387b71e8e5e4940f22194041276cbc14c1eed8fd60e502d9601c35ef24655e not found: ID does not exist" Apr 16 14:46:58.789888 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.789872 2574 scope.go:117] "RemoveContainer" containerID="b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5" Apr 16 14:46:58.790130 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:46:58.790110 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5\": container with ID starting with b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5 not found: ID does not exist" containerID="b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5" Apr 16 14:46:58.790210 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.790135 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5"} err="failed to get container status \"b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5\": rpc error: code = NotFound desc = could not find container \"b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5\": container with ID starting with b716b7e7091514527107b4daf6acaaeba6c9082eb4e298aed633f3744ce4d3b5 not found: ID does not exist" Apr 16 14:46:58.805630 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.805598 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54"] Apr 16 14:46:58.809972 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:58.809948 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-778988494c-dtx54"] Apr 16 14:46:59.188540 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:59.188498 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" path="/var/lib/kubelet/pods/69bd7774-1028-468c-893e-de2d1e5c03c1/volumes" Apr 16 14:46:59.780373 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:59.780342 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" event={"ID":"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c","Type":"ContainerStarted","Data":"9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4"} Apr 16 14:46:59.780635 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:59.780615 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:46:59.781985 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:59.781958 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:46:59.797346 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:46:59.797299 2574 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podStartSLOduration=6.797285181 podStartE2EDuration="6.797285181s" podCreationTimestamp="2026-04-16 14:46:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:46:59.795935014 +0000 UTC m=+2861.157477785" watchObservedRunningTime="2026-04-16 14:46:59.797285181 +0000 UTC m=+2861.158828012" Apr 16 14:47:00.784125 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:47:00.784085 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:47:10.784542 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:47:10.784496 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:47:20.785079 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:47:20.785035 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:47:30.784492 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:47:30.784419 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:47:40.784823 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:47:40.784778 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:47:50.784887 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:47:50.784845 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:48:00.784641 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:00.784563 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:48:05.190563 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:05.190530 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:48:13.662511 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.662475 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj"] Apr 16 14:48:13.663015 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.662726 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" containerID="cri-o://9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4" gracePeriod=30 Apr 16 14:48:13.728571 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.728532 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29"] Apr 16 14:48:13.728858 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.728845 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" Apr 16 14:48:13.728905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.728860 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" Apr 16 14:48:13.728905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.728871 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="storage-initializer" Apr 16 14:48:13.728905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.728877 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="storage-initializer" Apr 16 14:48:13.729024 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.728924 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="69bd7774-1028-468c-893e-de2d1e5c03c1" containerName="kserve-container" Apr 16 14:48:13.733603 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.732890 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:48:13.740701 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.740669 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29"] Apr 16 14:48:13.749689 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.749652 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/253bf1ff-605c-4264-a68d-2b1259516120-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-lhz29\" (UID: \"253bf1ff-605c-4264-a68d-2b1259516120\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:48:13.850083 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.850047 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/253bf1ff-605c-4264-a68d-2b1259516120-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-lhz29\" (UID: \"253bf1ff-605c-4264-a68d-2b1259516120\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:48:13.850404 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:13.850386 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/253bf1ff-605c-4264-a68d-2b1259516120-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-lhz29\" (UID: \"253bf1ff-605c-4264-a68d-2b1259516120\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:48:14.046446 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:14.046368 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:48:14.166382 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:14.166352 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29"] Apr 16 14:48:14.168608 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:48:14.168565 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod253bf1ff_605c_4264_a68d_2b1259516120.slice/crio-60bec839236df1e3aa24c8a70391dcdc0203de3e92401f01b85178c14d55302b WatchSource:0}: Error finding container 60bec839236df1e3aa24c8a70391dcdc0203de3e92401f01b85178c14d55302b: Status 404 returned error can't find the container with id 60bec839236df1e3aa24c8a70391dcdc0203de3e92401f01b85178c14d55302b Apr 16 14:48:14.999392 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:14.999357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" event={"ID":"253bf1ff-605c-4264-a68d-2b1259516120","Type":"ContainerStarted","Data":"5d8f4f635e3351d2f80566dc6744604719f5535b003184691088bc49f7f991fd"} Apr 16 14:48:14.999392 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:14.999392 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" event={"ID":"253bf1ff-605c-4264-a68d-2b1259516120","Type":"ContainerStarted","Data":"60bec839236df1e3aa24c8a70391dcdc0203de3e92401f01b85178c14d55302b"} Apr 16 14:48:15.183909 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:15.183855 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.55:8080: connect: connection refused" Apr 16 14:48:18.220094 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:18.220070 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:48:18.282004 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:18.281971 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c-kserve-provision-location\") pod \"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c\" (UID: \"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c\") " Apr 16 14:48:18.282336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:18.282313 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" (UID: "b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:48:18.382844 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:18.382810 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:48:19.010459 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.010425 2574 generic.go:358] "Generic (PLEG): container finished" podID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerID="9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4" exitCode=0 Apr 16 14:48:19.010687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.010491 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" Apr 16 14:48:19.010687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.010502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" event={"ID":"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c","Type":"ContainerDied","Data":"9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4"} Apr 16 14:48:19.010687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.010534 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj" event={"ID":"b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c","Type":"ContainerDied","Data":"b23f9fd29c4d4a1157185d5a26c9de77d662fe93b10cba94ad170165f4886cff"} Apr 16 14:48:19.010687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.010554 2574 scope.go:117] "RemoveContainer" containerID="9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4" Apr 16 14:48:19.011838 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.011814 2574 generic.go:358] "Generic (PLEG): container finished" podID="253bf1ff-605c-4264-a68d-2b1259516120" containerID="5d8f4f635e3351d2f80566dc6744604719f5535b003184691088bc49f7f991fd" exitCode=0 Apr 16 14:48:19.011957 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.011895 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" event={"ID":"253bf1ff-605c-4264-a68d-2b1259516120","Type":"ContainerDied","Data":"5d8f4f635e3351d2f80566dc6744604719f5535b003184691088bc49f7f991fd"} Apr 16 14:48:19.013253 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.013235 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:48:19.019808 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.019696 2574 scope.go:117] "RemoveContainer" containerID="5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab" Apr 16 14:48:19.027131 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.027112 2574 scope.go:117] "RemoveContainer" containerID="9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4" Apr 16 14:48:19.027389 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:48:19.027369 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4\": container with ID starting with 9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4 not found: ID does not exist" containerID="9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4" Apr 16 14:48:19.027460 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:48:19.027395 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4"} err="failed to get container status \"9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4\": rpc error: code = NotFound desc = could not find container \"9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4\": container with ID starting with 9a0b6b845e18fecb2c4cd27f840c86981b1abc5c08cad81d9dc72a13c27c08d4 not found: ID does not exist" Apr 16 14:48:19.027460 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.027417 2574 scope.go:117] "RemoveContainer" containerID="5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab" Apr 16 14:48:19.028122 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:48:19.028095 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab\": container with ID starting with 5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab not found: ID does not exist" containerID="5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab" Apr 16 14:48:19.028196 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.028134 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab"} err="failed to get container status \"5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab\": rpc error: code = NotFound desc = could not find container \"5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab\": container with ID starting with 5ad3ab4b674516b60fbb3ce92ca5ce57d4f9164d07f70a474b6fa978fd7e6bab not found: ID does not exist" Apr 16 14:48:19.040680 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.040654 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj"] Apr 16 14:48:19.044617 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.044596 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-778d8cbb88-755vj"] Apr 16 14:48:19.191224 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:19.191192 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" path="/var/lib/kubelet/pods/b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c/volumes" Apr 16 14:48:23.026397 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:23.026367 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" event={"ID":"253bf1ff-605c-4264-a68d-2b1259516120","Type":"ContainerStarted","Data":"da6ed249d4a0fdb8e8e5b41f3d631797f9c579b9316b8b48157df5049b82e847"} Apr 16 14:48:23.026893 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:23.026617 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:48:23.028141 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:23.028115 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused" Apr 16 14:48:23.042859 ip-10-0-139-151 
kubenswrapper[2574]: I0416 14:48:23.042809 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" podStartSLOduration=6.160982939 podStartE2EDuration="10.042793329s" podCreationTimestamp="2026-04-16 14:48:13 +0000 UTC" firstStartedPulling="2026-04-16 14:48:19.013358929 +0000 UTC m=+2940.374901679" lastFinishedPulling="2026-04-16 14:48:22.895169312 +0000 UTC m=+2944.256712069" observedRunningTime="2026-04-16 14:48:23.041352554 +0000 UTC m=+2944.402895322" watchObservedRunningTime="2026-04-16 14:48:23.042793329 +0000 UTC m=+2944.404336098"
Apr 16 14:48:24.030306 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:24.030269 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 16 14:48:34.031116 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:34.031051 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.56:8080: connect: connection refused"
Apr 16 14:48:44.031981 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:48:44.031902 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29"
Apr 16 14:49:04.777261 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.777225 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29"]
Apr 16 14:49:04.777796 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.777508 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="kserve-container" containerID="cri-o://da6ed249d4a0fdb8e8e5b41f3d631797f9c579b9316b8b48157df5049b82e847" gracePeriod=30
Apr 16 14:49:04.846015 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.845976 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"]
Apr 16 14:49:04.846287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.846275 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="storage-initializer"
Apr 16 14:49:04.846336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.846288 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="storage-initializer"
Apr 16 14:49:04.846336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.846311 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container"
Apr 16 14:49:04.846336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.846317 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container"
Apr 16 14:49:04.846432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.846362 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0d29c38-4e0b-4ae7-89bc-784cd8b8ea3c" containerName="kserve-container"
Apr 16 14:49:04.849172 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.849149 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"
Apr 16 14:49:04.855709 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.855677 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"]
Apr 16 14:49:04.961242 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:04.961202 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66291509-d0a8-476d-97e6-c0d3eccfe25a-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb\" (UID: \"66291509-d0a8-476d-97e6-c0d3eccfe25a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"
Apr 16 14:49:05.062093 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:05.062005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66291509-d0a8-476d-97e6-c0d3eccfe25a-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb\" (UID: \"66291509-d0a8-476d-97e6-c0d3eccfe25a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"
Apr 16 14:49:05.062425 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:05.062400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66291509-d0a8-476d-97e6-c0d3eccfe25a-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb\" (UID: \"66291509-d0a8-476d-97e6-c0d3eccfe25a\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"
Apr 16 14:49:05.160870 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:05.160836 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"
Apr 16 14:49:05.280258 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:05.280078 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"]
Apr 16 14:49:05.283078 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:49:05.283049 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod66291509_d0a8_476d_97e6_c0d3eccfe25a.slice/crio-5f991cfa4c1f92b5dfce7caf52b2fd1bd5cb1288df63429e6e9201aee41106c3 WatchSource:0}: Error finding container 5f991cfa4c1f92b5dfce7caf52b2fd1bd5cb1288df63429e6e9201aee41106c3: Status 404 returned error can't find the container with id 5f991cfa4c1f92b5dfce7caf52b2fd1bd5cb1288df63429e6e9201aee41106c3
Apr 16 14:49:06.150230 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:06.150191 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" event={"ID":"66291509-d0a8-476d-97e6-c0d3eccfe25a","Type":"ContainerStarted","Data":"a3cd5426677f7b219307b66f117a4fff52e22991130faf830baba5a34574113e"}
Apr 16 14:49:06.150230 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:06.150233 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" event={"ID":"66291509-d0a8-476d-97e6-c0d3eccfe25a","Type":"ContainerStarted","Data":"5f991cfa4c1f92b5dfce7caf52b2fd1bd5cb1288df63429e6e9201aee41106c3"}
Apr 16 14:49:10.164176 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:10.164140 2574 generic.go:358] "Generic (PLEG): container finished" podID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerID="a3cd5426677f7b219307b66f117a4fff52e22991130faf830baba5a34574113e" exitCode=0
Apr 16 14:49:10.164557 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:10.164214 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" event={"ID":"66291509-d0a8-476d-97e6-c0d3eccfe25a","Type":"ContainerDied","Data":"a3cd5426677f7b219307b66f117a4fff52e22991130faf830baba5a34574113e"}
Apr 16 14:49:11.168148 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:11.168114 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" event={"ID":"66291509-d0a8-476d-97e6-c0d3eccfe25a","Type":"ContainerStarted","Data":"26b6bff00f2254be5a0258c5aa9fe761616da517889783477b40a07b166f6101"}
Apr 16 14:49:11.168560 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:11.168405 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"
Apr 16 14:49:11.169738 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:11.169711 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused"
14:49:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:49:11.184163085 +0000 UTC m=+2992.545705863" watchObservedRunningTime="2026-04-16 14:49:11.185056472 +0000 UTC m=+2992.546599239" Apr 16 14:49:12.173997 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:12.173961 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.57:8080: connect: connection refused" Apr 16 14:49:22.175512 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:22.175480 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" Apr 16 14:49:35.244310 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:35.244275 2574 generic.go:358] "Generic (PLEG): container finished" podID="253bf1ff-605c-4264-a68d-2b1259516120" containerID="da6ed249d4a0fdb8e8e5b41f3d631797f9c579b9316b8b48157df5049b82e847" exitCode=137 Apr 16 14:49:35.244716 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:35.244331 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" event={"ID":"253bf1ff-605c-4264-a68d-2b1259516120","Type":"ContainerDied","Data":"da6ed249d4a0fdb8e8e5b41f3d631797f9c579b9316b8b48157df5049b82e847"} Apr 16 14:49:35.415977 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:35.415951 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:49:35.514608 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:35.514482 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/253bf1ff-605c-4264-a68d-2b1259516120-kserve-provision-location\") pod \"253bf1ff-605c-4264-a68d-2b1259516120\" (UID: \"253bf1ff-605c-4264-a68d-2b1259516120\") " Apr 16 14:49:35.525431 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:35.525385 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/253bf1ff-605c-4264-a68d-2b1259516120-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "253bf1ff-605c-4264-a68d-2b1259516120" (UID: "253bf1ff-605c-4264-a68d-2b1259516120"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:49:35.615376 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:35.615341 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/253bf1ff-605c-4264-a68d-2b1259516120-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:49:36.086826 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.086790 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"] Apr 16 14:49:36.087178 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.087153 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="kserve-container" containerID="cri-o://26b6bff00f2254be5a0258c5aa9fe761616da517889783477b40a07b166f6101" gracePeriod=30 Apr 16 14:49:36.207018 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.206980 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf"] Apr 16 14:49:36.207310 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.207297 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="storage-initializer" Apr 16 14:49:36.207364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.207313 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="storage-initializer" Apr 16 14:49:36.207364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.207336 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="kserve-container" Apr 16 14:49:36.207364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.207344 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="kserve-container" Apr 16 14:49:36.207483 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.207406 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="253bf1ff-605c-4264-a68d-2b1259516120" containerName="kserve-container" Apr 16 14:49:36.209822 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.209803 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:49:36.219937 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.219915 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf"] Apr 16 14:49:36.248389 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.248364 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" Apr 16 14:49:36.248864 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.248359 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29" event={"ID":"253bf1ff-605c-4264-a68d-2b1259516120","Type":"ContainerDied","Data":"60bec839236df1e3aa24c8a70391dcdc0203de3e92401f01b85178c14d55302b"} Apr 16 14:49:36.248864 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.248487 2574 scope.go:117] "RemoveContainer" containerID="da6ed249d4a0fdb8e8e5b41f3d631797f9c579b9316b8b48157df5049b82e847" Apr 16 14:49:36.256651 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.256628 2574 scope.go:117] "RemoveContainer" containerID="5d8f4f635e3351d2f80566dc6744604719f5535b003184691088bc49f7f991fd" Apr 16 14:49:36.268399 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.268373 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29"] Apr 16 14:49:36.274547 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.274526 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-lhz29"] Apr 16 14:49:36.320496 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.320465 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1b260d6-d01f-443a-8d45-fd16cc366765-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ddwrf\" (UID: \"c1b260d6-d01f-443a-8d45-fd16cc366765\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:49:36.421196 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.421098 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1b260d6-d01f-443a-8d45-fd16cc366765-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ddwrf\" (UID: \"c1b260d6-d01f-443a-8d45-fd16cc366765\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:49:36.421510 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.421489 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1b260d6-d01f-443a-8d45-fd16cc366765-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ddwrf\" (UID: \"c1b260d6-d01f-443a-8d45-fd16cc366765\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:49:36.520424 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.520383 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:49:36.641469 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:36.641444 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf"] Apr 16 14:49:36.643224 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:49:36.643195 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1b260d6_d01f_443a_8d45_fd16cc366765.slice/crio-a0e3fe1b57371eb5c6dd4e2b19d74db94cf0a173e3f2c657ee2ae4de5367ce4d WatchSource:0}: Error finding container a0e3fe1b57371eb5c6dd4e2b19d74db94cf0a173e3f2c657ee2ae4de5367ce4d: Status 404 returned error can't find the container with id a0e3fe1b57371eb5c6dd4e2b19d74db94cf0a173e3f2c657ee2ae4de5367ce4d Apr 16 14:49:37.187344 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:37.187310 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253bf1ff-605c-4264-a68d-2b1259516120" path="/var/lib/kubelet/pods/253bf1ff-605c-4264-a68d-2b1259516120/volumes" Apr 16 14:49:37.252864 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:37.252823 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" event={"ID":"c1b260d6-d01f-443a-8d45-fd16cc366765","Type":"ContainerStarted","Data":"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f"} Apr 16 14:49:37.252864 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:37.252870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" event={"ID":"c1b260d6-d01f-443a-8d45-fd16cc366765","Type":"ContainerStarted","Data":"a0e3fe1b57371eb5c6dd4e2b19d74db94cf0a173e3f2c657ee2ae4de5367ce4d"} Apr 16 14:49:41.266546 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:41.266513 2574 generic.go:358] "Generic (PLEG): container finished" podID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerID="548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f" exitCode=0 Apr 16 14:49:41.266943 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:41.266560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" event={"ID":"c1b260d6-d01f-443a-8d45-fd16cc366765","Type":"ContainerDied","Data":"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f"} Apr 16 14:49:42.258273 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:42.258239 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:49:42.269850 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:49:42.269819 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:50:06.389984 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:06.389838 2574 generic.go:358] "Generic (PLEG): container finished" podID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerID="26b6bff00f2254be5a0258c5aa9fe761616da517889783477b40a07b166f6101" exitCode=137 Apr 16 14:50:06.389984 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:06.389939 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" 
event={"ID":"66291509-d0a8-476d-97e6-c0d3eccfe25a","Type":"ContainerDied","Data":"26b6bff00f2254be5a0258c5aa9fe761616da517889783477b40a07b166f6101"} Apr 16 14:50:06.781921 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:06.781885 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" Apr 16 14:50:06.904194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:06.904162 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66291509-d0a8-476d-97e6-c0d3eccfe25a-kserve-provision-location\") pod \"66291509-d0a8-476d-97e6-c0d3eccfe25a\" (UID: \"66291509-d0a8-476d-97e6-c0d3eccfe25a\") " Apr 16 14:50:06.912038 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:06.911991 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/66291509-d0a8-476d-97e6-c0d3eccfe25a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "66291509-d0a8-476d-97e6-c0d3eccfe25a" (UID: "66291509-d0a8-476d-97e6-c0d3eccfe25a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:50:07.006035 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:07.005788 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/66291509-d0a8-476d-97e6-c0d3eccfe25a-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:50:07.394955 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:07.394911 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" event={"ID":"66291509-d0a8-476d-97e6-c0d3eccfe25a","Type":"ContainerDied","Data":"5f991cfa4c1f92b5dfce7caf52b2fd1bd5cb1288df63429e6e9201aee41106c3"} Apr 16 14:50:07.394955 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:07.394961 2574 scope.go:117] "RemoveContainer" containerID="26b6bff00f2254be5a0258c5aa9fe761616da517889783477b40a07b166f6101" Apr 16 14:50:07.395502 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:07.395067 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb" Apr 16 14:50:07.405651 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:07.405630 2574 scope.go:117] "RemoveContainer" containerID="a3cd5426677f7b219307b66f117a4fff52e22991130faf830baba5a34574113e" Apr 16 14:50:07.413503 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:07.413455 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"] Apr 16 14:50:07.415471 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:07.415449 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-jl2tb"] Apr 16 14:50:09.188459 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:50:09.188421 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" path="/var/lib/kubelet/pods/66291509-d0a8-476d-97e6-c0d3eccfe25a/volumes" Apr 16 14:51:36.685991 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:36.685956 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" event={"ID":"c1b260d6-d01f-443a-8d45-fd16cc366765","Type":"ContainerStarted","Data":"c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394"} Apr 16 14:51:36.686386 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:36.686147 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:51:36.687414 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:36.687388 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 14:51:36.705503 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:36.705448 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" podStartSLOduration=5.795870795 podStartE2EDuration="2m0.705428267s" podCreationTimestamp="2026-04-16 14:49:36 +0000 UTC" firstStartedPulling="2026-04-16 14:49:41.267530471 +0000 UTC m=+3022.629073217" lastFinishedPulling="2026-04-16 14:51:36.17708794 +0000 UTC m=+3137.538630689" observedRunningTime="2026-04-16 14:51:36.703534141 +0000 UTC m=+3138.065076909" watchObservedRunningTime="2026-04-16 14:51:36.705428267 +0000 UTC m=+3138.066971037" Apr 16 14:51:37.689486 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:37.689448 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.58:8080: connect: connection refused" Apr 16 14:51:47.691144 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:47.691114 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:51:57.745439 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.745397 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf"] Apr 16 14:51:57.745981 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.745739 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="kserve-container" containerID="cri-o://c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394" gracePeriod=30 Apr 16 14:51:57.809287 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.809251 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"] Apr 16 14:51:57.809612 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.809599 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="storage-initializer" Apr 16 14:51:57.809662 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.809614 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="storage-initializer" Apr 16 14:51:57.809662 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.809625 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="kserve-container" Apr 16 14:51:57.809662 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.809631 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="kserve-container" Apr 16 14:51:57.809764 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.809677 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="66291509-d0a8-476d-97e6-c0d3eccfe25a" containerName="kserve-container" Apr 16 14:51:57.812568 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.812550 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" Apr 16 14:51:57.820834 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.820803 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"] Apr 16 14:51:57.882319 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.882283 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bff50c9e-a3d3-4717-9e41-e30fbcb787b7-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-dlvst\" (UID: \"bff50c9e-a3d3-4717-9e41-e30fbcb787b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" Apr 16 14:51:57.983613 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.983485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bff50c9e-a3d3-4717-9e41-e30fbcb787b7-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-dlvst\" (UID: \"bff50c9e-a3d3-4717-9e41-e30fbcb787b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" Apr 16 14:51:57.983902 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:57.983881 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bff50c9e-a3d3-4717-9e41-e30fbcb787b7-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-dlvst\" (UID: \"bff50c9e-a3d3-4717-9e41-e30fbcb787b7\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" Apr 16 14:51:58.123665 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:58.123625 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" Apr 16 14:51:58.277826 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:58.277730 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"] Apr 16 14:51:58.281682 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:51:58.281649 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbff50c9e_a3d3_4717_9e41_e30fbcb787b7.slice/crio-046b6786122f82f609ed2e7d25ee9798e6b10650bccb60df7ae498f5d1fa08c1 WatchSource:0}: Error finding container 046b6786122f82f609ed2e7d25ee9798e6b10650bccb60df7ae498f5d1fa08c1: Status 404 returned error can't find the container with id 046b6786122f82f609ed2e7d25ee9798e6b10650bccb60df7ae498f5d1fa08c1 Apr 16 14:51:58.749103 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:58.749069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" event={"ID":"bff50c9e-a3d3-4717-9e41-e30fbcb787b7","Type":"ContainerStarted","Data":"31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70"} Apr 16 14:51:58.749620 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:58.749131 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" event={"ID":"bff50c9e-a3d3-4717-9e41-e30fbcb787b7","Type":"ContainerStarted","Data":"046b6786122f82f609ed2e7d25ee9798e6b10650bccb60df7ae498f5d1fa08c1"} Apr 16 14:51:59.985161 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:59.985139 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:51:59.998569 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:59.998546 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1b260d6-d01f-443a-8d45-fd16cc366765-kserve-provision-location\") pod \"c1b260d6-d01f-443a-8d45-fd16cc366765\" (UID: \"c1b260d6-d01f-443a-8d45-fd16cc366765\") " Apr 16 14:51:59.998920 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:51:59.998895 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1b260d6-d01f-443a-8d45-fd16cc366765-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c1b260d6-d01f-443a-8d45-fd16cc366765" (UID: "c1b260d6-d01f-443a-8d45-fd16cc366765"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:52:00.099226 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.099130 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c1b260d6-d01f-443a-8d45-fd16cc366765-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:52:00.756852 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.756816 2574 generic.go:358] "Generic (PLEG): container finished" podID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerID="c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394" exitCode=0 Apr 16 14:52:00.757059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.756890 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" Apr 16 14:52:00.757059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.756940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" event={"ID":"c1b260d6-d01f-443a-8d45-fd16cc366765","Type":"ContainerDied","Data":"c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394"} Apr 16 14:52:00.757059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.756985 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf" event={"ID":"c1b260d6-d01f-443a-8d45-fd16cc366765","Type":"ContainerDied","Data":"a0e3fe1b57371eb5c6dd4e2b19d74db94cf0a173e3f2c657ee2ae4de5367ce4d"} Apr 16 14:52:00.757059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.757004 2574 scope.go:117] "RemoveContainer" containerID="c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394" Apr 16 14:52:00.765166 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.765150 2574 scope.go:117] "RemoveContainer" containerID="548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f" Apr 16 14:52:00.771968 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.771949 2574 scope.go:117] "RemoveContainer" containerID="c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394" Apr 16 14:52:00.772230 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:52:00.772210 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394\": container with ID starting with c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394 not found: ID does not exist" containerID="c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394" Apr 16 14:52:00.772288 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.772240 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394"} err="failed to get container status \"c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394\": rpc error: code = NotFound desc = could not find container \"c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394\": container with ID starting with c50a948534eccd8b6748da6bc1365cd78c262ae18b6338a62a21bd9bb4711394 not found: ID does not exist" Apr 16 14:52:00.772288 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.772259 2574 scope.go:117] "RemoveContainer" containerID="548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f" Apr 16 14:52:00.772473 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:52:00.772459 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f\": container with ID starting with 548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f not found: ID does not exist" containerID="548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f" Apr 16 14:52:00.772518 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.772476 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f"} err="failed to get container status \"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f\": rpc error: code = NotFound desc = 
Apr 16 14:52:00.772518 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.772476 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f"} err="failed to get container status \"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f\": rpc error: code = NotFound desc = could not find container \"548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f\": container with ID starting with 548474ff9561bd559f5e7bcaf45bf1a792b54fc61cc4d026a435e43d8e23054f not found: ID does not exist"
Apr 16 14:52:00.782905 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.782884 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf"]
Apr 16 14:52:00.790059 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:00.790035 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ddwrf"]
Apr 16 14:52:01.187539 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:01.187507 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" path="/var/lib/kubelet/pods/c1b260d6-d01f-443a-8d45-fd16cc366765/volumes"
Apr 16 14:52:02.765048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:02.764966 2574 generic.go:358] "Generic (PLEG): container finished" podID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerID="31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70" exitCode=0
Apr 16 14:52:02.765406 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:02.765040 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" event={"ID":"bff50c9e-a3d3-4717-9e41-e30fbcb787b7","Type":"ContainerDied","Data":"31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70"}
Apr 16 14:52:21.827146 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:21.827109 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" event={"ID":"bff50c9e-a3d3-4717-9e41-e30fbcb787b7","Type":"ContainerStarted","Data":"d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91"}
Apr 16 14:52:21.827575 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:21.827392 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"
Apr 16 14:52:21.828717 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:21.828691 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 14:52:21.844389 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:21.844328 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podStartSLOduration=6.854564107 podStartE2EDuration="24.844315238s" podCreationTimestamp="2026-04-16 14:51:57 +0000 UTC" firstStartedPulling="2026-04-16 14:52:02.76624165 +0000 UTC m=+3164.127784396" lastFinishedPulling="2026-04-16 14:52:20.755992781 +0000 UTC m=+3182.117535527" observedRunningTime="2026-04-16 14:52:21.842863269 +0000 UTC m=+3183.204406037" watchObservedRunningTime="2026-04-16 14:52:21.844315238 +0000 UTC m=+3183.205858005"
Apr 16 14:52:22.829838 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:22.829807 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 14:52:32.830365 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:32.830321 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 14:52:42.830353 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:42.830309 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 14:52:52.830439 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:52:52.830388 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 14:53:02.830392 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:02.830347 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 14:53:12.830352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:12.830312 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.59:8080: connect: connection refused"
Apr 16 14:53:22.831776 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:22.831742 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"
Apr 16 14:53:27.967668 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:27.967634 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"]
Apr 16 14:53:27.968180 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:27.967906 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container" containerID="cri-o://d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91" gracePeriod=30
Apr 16 14:53:28.116877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.116844 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"]
Apr 16 14:53:28.117150 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.117139 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="kserve-container"
Apr 16 14:53:28.117212 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.117152 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="kserve-container"
Apr 16 14:53:28.117212 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.117169 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="storage-initializer"
Apr 16 14:53:28.117212 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.117182 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="storage-initializer"
Apr 16 14:53:28.117309 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.117227 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1b260d6-d01f-443a-8d45-fd16cc366765" containerName="kserve-container"
Apr 16 14:53:28.120488 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.120469 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"
Apr 16 14:53:28.132407 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.132383 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"]
Apr 16 14:53:28.220608 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.220488 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/821cbdb7-5f44-4358-ba67-4b735d707ca9-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd\" (UID: \"821cbdb7-5f44-4358-ba67-4b735d707ca9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"
Apr 16 14:53:28.321922 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.321881 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/821cbdb7-5f44-4358-ba67-4b735d707ca9-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd\" (UID: \"821cbdb7-5f44-4358-ba67-4b735d707ca9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"
Apr 16 14:53:28.322289 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.322266 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/821cbdb7-5f44-4358-ba67-4b735d707ca9-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd\" (UID: \"821cbdb7-5f44-4358-ba67-4b735d707ca9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" Apr 16 14:53:28.551476 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.551449 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"] Apr 16 14:53:28.554342 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:53:28.554312 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod821cbdb7_5f44_4358_ba67_4b735d707ca9.slice/crio-e87a62d8475ebaba797ec01c8d98d48d6a02dbfa5822b7cc74413f08bcdecbdc WatchSource:0}: Error finding container e87a62d8475ebaba797ec01c8d98d48d6a02dbfa5822b7cc74413f08bcdecbdc: Status 404 returned error can't find the container with id e87a62d8475ebaba797ec01c8d98d48d6a02dbfa5822b7cc74413f08bcdecbdc Apr 16 14:53:28.556634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:28.556618 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:53:29.018725 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:29.018689 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" event={"ID":"821cbdb7-5f44-4358-ba67-4b735d707ca9","Type":"ContainerStarted","Data":"bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806"} Apr 16 14:53:29.018725 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:29.018731 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" event={"ID":"821cbdb7-5f44-4358-ba67-4b735d707ca9","Type":"ContainerStarted","Data":"e87a62d8475ebaba797ec01c8d98d48d6a02dbfa5822b7cc74413f08bcdecbdc"} Apr 16 14:53:31.708395 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:31.708363 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" Apr 16 14:53:31.854855 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:31.854820 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bff50c9e-a3d3-4717-9e41-e30fbcb787b7-kserve-provision-location\") pod \"bff50c9e-a3d3-4717-9e41-e30fbcb787b7\" (UID: \"bff50c9e-a3d3-4717-9e41-e30fbcb787b7\") " Apr 16 14:53:31.855154 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:31.855131 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bff50c9e-a3d3-4717-9e41-e30fbcb787b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bff50c9e-a3d3-4717-9e41-e30fbcb787b7" (UID: "bff50c9e-a3d3-4717-9e41-e30fbcb787b7"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:53:31.956219 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:31.956180 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bff50c9e-a3d3-4717-9e41-e30fbcb787b7-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:53:32.028637 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.028572 2574 generic.go:358] "Generic (PLEG): container finished" podID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerID="d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91" exitCode=0 Apr 16 14:53:32.028833 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.028670 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" Apr 16 14:53:32.028833 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.028678 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" event={"ID":"bff50c9e-a3d3-4717-9e41-e30fbcb787b7","Type":"ContainerDied","Data":"d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91"} Apr 16 14:53:32.028833 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.028719 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst" event={"ID":"bff50c9e-a3d3-4717-9e41-e30fbcb787b7","Type":"ContainerDied","Data":"046b6786122f82f609ed2e7d25ee9798e6b10650bccb60df7ae498f5d1fa08c1"} Apr 16 14:53:32.028833 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.028734 2574 scope.go:117] "RemoveContainer" containerID="d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91" Apr 16 14:53:32.037698 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.037677 2574 scope.go:117] "RemoveContainer" containerID="31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70" Apr 16 14:53:32.044877 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.044856 2574 scope.go:117] "RemoveContainer" containerID="d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91" Apr 16 14:53:32.045158 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:53:32.045138 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91\": container with ID starting with d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91 not found: ID does not exist" containerID="d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91" Apr 16 14:53:32.045257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.045169 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91"} err="failed to get container status \"d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91\": rpc error: code = NotFound desc = could not find container \"d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91\": container with ID starting with d8523d5e493d54706c307dc461ebd1e37591782c77e6273400073c37ce5dea91 not found: ID does not exist" Apr 16 14:53:32.045257 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.045204 2574 scope.go:117] "RemoveContainer" containerID="31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70" Apr 16 14:53:32.045478 ip-10-0-139-151 kubenswrapper[2574]: E0416 
14:53:32.045460 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70\": container with ID starting with 31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70 not found: ID does not exist" containerID="31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70" Apr 16 14:53:32.045520 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.045484 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70"} err="failed to get container status \"31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70\": rpc error: code = NotFound desc = could not find container \"31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70\": container with ID starting with 31f68ec65c93c7ca51fd7879fda01ccaae536f19fca838a86e971c45d98a9e70 not found: ID does not exist" Apr 16 14:53:32.048817 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.048789 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"] Apr 16 14:53:32.052452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:32.052425 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-dlvst"] Apr 16 14:53:33.033617 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:33.033570 2574 generic.go:358] "Generic (PLEG): container finished" podID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerID="bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806" exitCode=0 Apr 16 14:53:33.033967 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:33.033645 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" event={"ID":"821cbdb7-5f44-4358-ba67-4b735d707ca9","Type":"ContainerDied","Data":"bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806"} Apr 16 14:53:33.187441 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:33.187410 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" path="/var/lib/kubelet/pods/bff50c9e-a3d3-4717-9e41-e30fbcb787b7/volumes" Apr 16 14:53:34.041756 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:34.041720 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" event={"ID":"821cbdb7-5f44-4358-ba67-4b735d707ca9","Type":"ContainerStarted","Data":"b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0"} Apr 16 14:53:34.042130 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:34.041973 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" Apr 16 14:53:34.057103 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:34.057049 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" podStartSLOduration=6.057033342 podStartE2EDuration="6.057033342s" podCreationTimestamp="2026-04-16 14:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:53:34.056339542 +0000 UTC m=+3255.417882314" watchObservedRunningTime="2026-04-16 14:53:34.057033342 +0000 UTC 
Apr 16 14:53:34.057103 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:53:34.057049 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" podStartSLOduration=6.057033342 podStartE2EDuration="6.057033342s" podCreationTimestamp="2026-04-16 14:53:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:53:34.056339542 +0000 UTC m=+3255.417882314" watchObservedRunningTime="2026-04-16 14:53:34.057033342 +0000 UTC m=+3255.418576109"
Apr 16 14:54:05.045913 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:05.045873 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.60:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.133.0.60:8080: connect: connection refused"
Apr 16 14:54:15.048670 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:15.048634 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"
Apr 16 14:54:18.120378 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.120345 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"]
Apr 16 14:54:18.120802 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.120610 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerName="kserve-container" containerID="cri-o://b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0" gracePeriod=30
Apr 16 14:54:18.175159 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.175125 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"]
Apr 16 14:54:18.175506 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.175489 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="storage-initializer"
Apr 16 14:54:18.175567 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.175509 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="storage-initializer"
Apr 16 14:54:18.175567 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.175523 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container"
Apr 16 14:54:18.175567 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.175529 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container"
Apr 16 14:54:18.175716 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.175609 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bff50c9e-a3d3-4717-9e41-e30fbcb787b7" containerName="kserve-container"
Apr 16 14:54:18.178739 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.178719 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"
Apr 16 14:54:18.187441 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.187413 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"]
Apr 16 14:54:18.218944 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.218914 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc13d5dc-324d-47ef-b115-29a94593711c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-mtrsr\" (UID: \"fc13d5dc-324d-47ef-b115-29a94593711c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"
Apr 16 14:54:18.320336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.320306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc13d5dc-324d-47ef-b115-29a94593711c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-mtrsr\" (UID: \"fc13d5dc-324d-47ef-b115-29a94593711c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"
Apr 16 14:54:18.320644 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.320627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc13d5dc-324d-47ef-b115-29a94593711c-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-mtrsr\" (UID: \"fc13d5dc-324d-47ef-b115-29a94593711c\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"
Apr 16 14:54:18.489528 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.489434 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"
Apr 16 14:54:18.671228 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:18.671192 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"]
Apr 16 14:54:18.674186 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:54:18.674160 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc13d5dc_324d_47ef_b115_29a94593711c.slice/crio-77f975f9d6ec2faca18193023a0ec4020a6fa963fbaf027c50715d1d09ef4888 WatchSource:0}: Error finding container 77f975f9d6ec2faca18193023a0ec4020a6fa963fbaf027c50715d1d09ef4888: Status 404 returned error can't find the container with id 77f975f9d6ec2faca18193023a0ec4020a6fa963fbaf027c50715d1d09ef4888
Apr 16 14:54:19.181767 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:19.181730 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" event={"ID":"fc13d5dc-324d-47ef-b115-29a94593711c","Type":"ContainerStarted","Data":"fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187"}
Apr 16 14:54:19.181767 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:19.181766 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" event={"ID":"fc13d5dc-324d-47ef-b115-29a94593711c","Type":"ContainerStarted","Data":"77f975f9d6ec2faca18193023a0ec4020a6fa963fbaf027c50715d1d09ef4888"}
Apr 16 14:54:23.196264 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:23.196228 2574 generic.go:358] "Generic (PLEG): container finished" podID="fc13d5dc-324d-47ef-b115-29a94593711c" containerID="fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187" exitCode=0
Apr 16 14:54:23.196763 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:23.196298 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" event={"ID":"fc13d5dc-324d-47ef-b115-29a94593711c","Type":"ContainerDied","Data":"fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187"}
Apr 16 14:54:24.205997 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:24.205959 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" event={"ID":"fc13d5dc-324d-47ef-b115-29a94593711c","Type":"ContainerStarted","Data":"23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea"}
Apr 16 14:54:24.206432 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:24.206176 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"
Apr 16 14:54:24.223537 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:24.223482 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" podStartSLOduration=6.223469343 podStartE2EDuration="6.223469343s" podCreationTimestamp="2026-04-16 14:54:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:54:24.221722776 +0000 UTC m=+3305.583265543" watchObservedRunningTime="2026-04-16 14:54:24.223469343 +0000 UTC m=+3305.585012106"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" Apr 16 14:54:24.973457 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:24.973416 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/821cbdb7-5f44-4358-ba67-4b735d707ca9-kserve-provision-location\") pod \"821cbdb7-5f44-4358-ba67-4b735d707ca9\" (UID: \"821cbdb7-5f44-4358-ba67-4b735d707ca9\") " Apr 16 14:54:24.973812 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:24.973789 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/821cbdb7-5f44-4358-ba67-4b735d707ca9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "821cbdb7-5f44-4358-ba67-4b735d707ca9" (UID: "821cbdb7-5f44-4358-ba67-4b735d707ca9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:54:25.074312 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.074270 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/821cbdb7-5f44-4358-ba67-4b735d707ca9-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:54:25.210319 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.210285 2574 generic.go:358] "Generic (PLEG): container finished" podID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerID="b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0" exitCode=0 Apr 16 14:54:25.210747 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.210352 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" Apr 16 14:54:25.210747 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.210366 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" event={"ID":"821cbdb7-5f44-4358-ba67-4b735d707ca9","Type":"ContainerDied","Data":"b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0"} Apr 16 14:54:25.210747 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.210428 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd" event={"ID":"821cbdb7-5f44-4358-ba67-4b735d707ca9","Type":"ContainerDied","Data":"e87a62d8475ebaba797ec01c8d98d48d6a02dbfa5822b7cc74413f08bcdecbdc"} Apr 16 14:54:25.210747 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.210451 2574 scope.go:117] "RemoveContainer" containerID="b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0" Apr 16 14:54:25.218666 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.218631 2574 scope.go:117] "RemoveContainer" containerID="bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806" Apr 16 14:54:25.224660 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.224637 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"] Apr 16 14:54:25.226073 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.226061 2574 scope.go:117] "RemoveContainer" containerID="b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0" Apr 16 14:54:25.226341 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:54:25.226320 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0\": container with ID starting with b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0 not found: ID does not exist" containerID="b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0" Apr 16 14:54:25.226406 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.226357 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0"} err="failed to get container status \"b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0\": rpc error: code = NotFound desc = could not find container \"b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0\": container with ID starting with b045a4a4c8eafdefddd80aca506f2f65d4b1ef7bf5002f73d477ae448848a9e0 not found: ID does not exist" Apr 16 14:54:25.226406 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.226379 2574 scope.go:117] "RemoveContainer" containerID="bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806" Apr 16 14:54:25.226649 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:54:25.226631 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806\": container with ID starting with bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806 not found: ID does not exist" containerID="bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806" Apr 16 14:54:25.226721 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.226654 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806"} err="failed to get container status \"bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806\": rpc error: code = NotFound desc = could not find container \"bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806\": container with ID starting with bd46ccba06b7409d3fcc948041cb2db6a1e52b7876d0d07791f34d93df727806 not found: ID does not exist" Apr 16 14:54:25.230356 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:25.230332 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-6c8wd"] Apr 16 14:54:27.186811 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:27.186774 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" path="/var/lib/kubelet/pods/821cbdb7-5f44-4358-ba67-4b735d707ca9/volumes" Apr 16 14:54:42.287666 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:42.287573 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:54:42.297376 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:42.297352 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:54:55.231821 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:55.231789 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" Apr 16 14:54:58.493490 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.493458 2574 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"] Apr 16 14:54:58.493940 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.493732 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" containerName="kserve-container" containerID="cri-o://23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea" gracePeriod=30 Apr 16 14:54:58.535934 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.535887 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw"] Apr 16 14:54:58.536263 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.536246 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerName="storage-initializer" Apr 16 14:54:58.536345 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.536265 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerName="storage-initializer" Apr 16 14:54:58.536345 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.536276 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerName="kserve-container" Apr 16 14:54:58.536345 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.536284 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerName="kserve-container" Apr 16 14:54:58.536504 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.536363 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="821cbdb7-5f44-4358-ba67-4b735d707ca9" containerName="kserve-container" Apr 16 14:54:58.539826 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.539801 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:54:58.545954 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.545927 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw"] Apr 16 14:54:58.660634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.660573 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-xdmtw\" (UID: \"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:54:58.762116 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.762004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-xdmtw\" (UID: \"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:54:58.762431 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.762413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-xdmtw\" (UID: \"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:54:58.850575 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.850537 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:54:58.971331 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:58.971299 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw"] Apr 16 14:54:58.975486 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:54:58.975456 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcad21c30_a3fe_4f2b_bfd4_b9d2efacc3f3.slice/crio-37193c761fc99ab7c87eddb64743c41c21fa34c2034d0cf6dbc473ff1cf25716 WatchSource:0}: Error finding container 37193c761fc99ab7c87eddb64743c41c21fa34c2034d0cf6dbc473ff1cf25716: Status 404 returned error can't find the container with id 37193c761fc99ab7c87eddb64743c41c21fa34c2034d0cf6dbc473ff1cf25716 Apr 16 14:54:59.309637 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:59.309519 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" event={"ID":"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3","Type":"ContainerStarted","Data":"97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3"} Apr 16 14:54:59.309637 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:54:59.309553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" event={"ID":"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3","Type":"ContainerStarted","Data":"37193c761fc99ab7c87eddb64743c41c21fa34c2034d0cf6dbc473ff1cf25716"} Apr 16 14:55:03.323867 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:03.323830 2574 generic.go:358] "Generic (PLEG): container finished" podID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerID="97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3" exitCode=0 Apr 16 14:55:03.324241 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:03.323904 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" event={"ID":"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3","Type":"ContainerDied","Data":"97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3"} Apr 16 14:55:04.328760 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:04.328724 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" event={"ID":"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3","Type":"ContainerStarted","Data":"a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715"} Apr 16 14:55:04.329225 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:04.329022 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:55:04.330602 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:04.330550 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 14:55:04.346132 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:04.346081 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podStartSLOduration=6.346066283 podStartE2EDuration="6.346066283s" podCreationTimestamp="2026-04-16 14:54:58 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:55:04.344220147 +0000 UTC m=+3345.705762913" watchObservedRunningTime="2026-04-16 14:55:04.346066283 +0000 UTC m=+3345.707609051" Apr 16 14:55:05.212012 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:05.211973 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.61:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.133.0.61:8080: connect: connection refused" Apr 16 14:55:05.332274 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:05.332241 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 14:55:05.936730 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:05.936709 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" Apr 16 14:55:06.008609 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.008485 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc13d5dc-324d-47ef-b115-29a94593711c-kserve-provision-location\") pod \"fc13d5dc-324d-47ef-b115-29a94593711c\" (UID: \"fc13d5dc-324d-47ef-b115-29a94593711c\") " Apr 16 14:55:06.008882 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.008861 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc13d5dc-324d-47ef-b115-29a94593711c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fc13d5dc-324d-47ef-b115-29a94593711c" (UID: "fc13d5dc-324d-47ef-b115-29a94593711c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:55:06.109706 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.109662 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc13d5dc-324d-47ef-b115-29a94593711c-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:55:06.336375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.336344 2574 generic.go:358] "Generic (PLEG): container finished" podID="fc13d5dc-324d-47ef-b115-29a94593711c" containerID="23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea" exitCode=0 Apr 16 14:55:06.336866 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.336408 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" Apr 16 14:55:06.336866 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.336409 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" event={"ID":"fc13d5dc-324d-47ef-b115-29a94593711c","Type":"ContainerDied","Data":"23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea"} Apr 16 14:55:06.336866 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.336552 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr" event={"ID":"fc13d5dc-324d-47ef-b115-29a94593711c","Type":"ContainerDied","Data":"77f975f9d6ec2faca18193023a0ec4020a6fa963fbaf027c50715d1d09ef4888"} Apr 16 14:55:06.336866 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.336575 2574 scope.go:117] "RemoveContainer" containerID="23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea" Apr 16 14:55:06.344117 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.344089 2574 scope.go:117] "RemoveContainer" containerID="fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187" Apr 16 14:55:06.350938 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.350919 2574 scope.go:117] "RemoveContainer" containerID="23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea" Apr 16 14:55:06.351181 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:55:06.351162 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea\": container with ID starting with 23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea not found: ID does not exist" containerID="23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea" Apr 16 14:55:06.351244 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.351190 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea"} err="failed to get container status \"23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea\": rpc error: code = NotFound desc = could not find container \"23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea\": container with ID starting with 23b0ac46a82118b5800cab70ef832d329b439d770364efeee41af4fe7c56a6ea not found: ID does not exist" Apr 16 14:55:06.351244 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.351208 2574 scope.go:117] "RemoveContainer" containerID="fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187" Apr 16 14:55:06.351429 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:55:06.351411 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187\": container with ID starting with fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187 not found: ID does not exist" containerID="fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187" Apr 16 14:55:06.351470 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.351437 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187"} err="failed to get container status \"fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187\": rpc error: code = 
NotFound desc = could not find container \"fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187\": container with ID starting with fcb31378103b8112bab2212b3a081c3c984e6ccfb38e106dfaf93d7c728b0187 not found: ID does not exist" Apr 16 14:55:06.355870 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.355848 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"] Apr 16 14:55:06.359362 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:06.359343 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-mtrsr"] Apr 16 14:55:07.187265 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:07.187223 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" path="/var/lib/kubelet/pods/fc13d5dc-324d-47ef-b115-29a94593711c/volumes" Apr 16 14:55:15.333348 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:15.333297 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 14:55:25.332896 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:25.332851 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 14:55:35.332366 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:35.332323 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 14:55:45.333095 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:45.333054 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 14:55:55.332981 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:55:55.332937 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.62:8080: connect: connection refused" Apr 16 14:56:05.333355 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:05.333324 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:56:08.651019 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.650975 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw"] Apr 16 14:56:08.651393 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.651272 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" 
containerID="cri-o://a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715" gracePeriod=30 Apr 16 14:56:08.748318 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.748280 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt"] Apr 16 14:56:08.748715 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.748662 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" containerName="kserve-container" Apr 16 14:56:08.748715 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.748680 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" containerName="kserve-container" Apr 16 14:56:08.748715 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.748692 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" containerName="storage-initializer" Apr 16 14:56:08.748715 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.748698 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" containerName="storage-initializer" Apr 16 14:56:08.748903 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.748772 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc13d5dc-324d-47ef-b115-29a94593711c" containerName="kserve-container" Apr 16 14:56:08.751890 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.751872 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:56:08.759880 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.759855 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt"] Apr 16 14:56:08.808986 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.808957 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b040e0a2-ef6c-4077-a18c-dc980cd46cf9-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt\" (UID: \"b040e0a2-ef6c-4077-a18c-dc980cd46cf9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:56:08.910278 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.910165 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b040e0a2-ef6c-4077-a18c-dc980cd46cf9-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt\" (UID: \"b040e0a2-ef6c-4077-a18c-dc980cd46cf9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:56:08.910548 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:08.910528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b040e0a2-ef6c-4077-a18c-dc980cd46cf9-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt\" (UID: \"b040e0a2-ef6c-4077-a18c-dc980cd46cf9\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:56:09.062336 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:09.062293 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:56:09.187094 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:56:09.187051 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb040e0a2_ef6c_4077_a18c_dc980cd46cf9.slice/crio-c16b33afc24295f20629c0511b56fd8fcd19272e1b7a61a089751d6b584b0b49 WatchSource:0}: Error finding container c16b33afc24295f20629c0511b56fd8fcd19272e1b7a61a089751d6b584b0b49: Status 404 returned error can't find the container with id c16b33afc24295f20629c0511b56fd8fcd19272e1b7a61a089751d6b584b0b49 Apr 16 14:56:09.187414 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:09.187394 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt"] Apr 16 14:56:09.524643 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:09.524548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" event={"ID":"b040e0a2-ef6c-4077-a18c-dc980cd46cf9","Type":"ContainerStarted","Data":"3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129"} Apr 16 14:56:09.524643 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:09.524610 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" event={"ID":"b040e0a2-ef6c-4077-a18c-dc980cd46cf9","Type":"ContainerStarted","Data":"c16b33afc24295f20629c0511b56fd8fcd19272e1b7a61a089751d6b584b0b49"} Apr 16 14:56:12.388873 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.388844 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:56:12.436082 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.436048 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3-kserve-provision-location\") pod \"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3\" (UID: \"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3\") " Apr 16 14:56:12.436364 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.436341 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" (UID: "cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:56:12.538828 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.538726 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:56:12.538980 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.538954 2574 generic.go:358] "Generic (PLEG): container finished" podID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerID="a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715" exitCode=0 Apr 16 14:56:12.539066 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.539042 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" event={"ID":"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3","Type":"ContainerDied","Data":"a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715"} Apr 16 14:56:12.539127 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.539059 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" Apr 16 14:56:12.539127 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.539092 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw" event={"ID":"cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3","Type":"ContainerDied","Data":"37193c761fc99ab7c87eddb64743c41c21fa34c2034d0cf6dbc473ff1cf25716"} Apr 16 14:56:12.539127 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.539115 2574 scope.go:117] "RemoveContainer" containerID="a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715" Apr 16 14:56:12.546788 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.546773 2574 scope.go:117] "RemoveContainer" containerID="97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3" Apr 16 14:56:12.553687 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.553672 2574 scope.go:117] "RemoveContainer" containerID="a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715" Apr 16 14:56:12.553951 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:56:12.553928 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715\": container with ID starting with a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715 not found: ID does not exist" containerID="a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715" Apr 16 14:56:12.554042 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.553964 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715"} err="failed to get container status \"a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715\": rpc error: code = NotFound desc = could not find container \"a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715\": container with ID starting with a240770d95e053bf18c58d55a6d48bdb41b540c1f1fb8c3073368a84ee7e9715 not found: ID does not exist" Apr 16 14:56:12.554042 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.553988 2574 scope.go:117] "RemoveContainer" containerID="97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3" Apr 16 14:56:12.554236 ip-10-0-139-151 
kubenswrapper[2574]: E0416 14:56:12.554220 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3\": container with ID starting with 97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3 not found: ID does not exist" containerID="97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3" Apr 16 14:56:12.554285 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.554242 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3"} err="failed to get container status \"97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3\": rpc error: code = NotFound desc = could not find container \"97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3\": container with ID starting with 97aae6ab87b6a08728ff525afb3af0834a2b51b6f78ac4eebf454af7fc400fc3 not found: ID does not exist" Apr 16 14:56:12.559850 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.559827 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw"] Apr 16 14:56:12.563714 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:12.563692 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-xdmtw"] Apr 16 14:56:13.186876 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:13.186833 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" path="/var/lib/kubelet/pods/cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3/volumes" Apr 16 14:56:13.542945 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:13.542856 2574 generic.go:358] "Generic (PLEG): container finished" podID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerID="3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129" exitCode=0 Apr 16 14:56:13.542945 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:13.542932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" event={"ID":"b040e0a2-ef6c-4077-a18c-dc980cd46cf9","Type":"ContainerDied","Data":"3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129"} Apr 16 14:56:14.548281 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:14.548249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" event={"ID":"b040e0a2-ef6c-4077-a18c-dc980cd46cf9","Type":"ContainerStarted","Data":"3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6"} Apr 16 14:56:14.548800 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:14.548488 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:56:14.565458 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:14.565409 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" podStartSLOduration=6.565394921 podStartE2EDuration="6.565394921s" podCreationTimestamp="2026-04-16 14:56:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:56:14.564087299 +0000 UTC m=+3415.925630090" watchObservedRunningTime="2026-04-16 
14:56:14.565394921 +0000 UTC m=+3415.926937689" Apr 16 14:56:45.631565 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:45.631519 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 14:56:55.554252 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:55.554219 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:56:58.920080 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.920043 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt"] Apr 16 14:56:58.920552 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.920393 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="kserve-container" containerID="cri-o://3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6" gracePeriod=30 Apr 16 14:56:58.923182 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.923156 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7"] Apr 16 14:56:58.923452 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.923438 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" Apr 16 14:56:58.923503 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.923455 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" Apr 16 14:56:58.923503 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.923470 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="storage-initializer" Apr 16 14:56:58.923503 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.923476 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="storage-initializer" Apr 16 14:56:58.923631 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.923534 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="cad21c30-a3fe-4f2b-bfd4-b9d2efacc3f3" containerName="kserve-container" Apr 16 14:56:58.926482 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.926465 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:56:58.937540 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:58.937516 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7"] Apr 16 14:56:59.015315 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:59.015279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb6cd7-df93-408b-8513-ba0ff064c5a2-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-75sz7\" (UID: \"0fbb6cd7-df93-408b-8513-ba0ff064c5a2\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:56:59.116198 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:59.116159 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb6cd7-df93-408b-8513-ba0ff064c5a2-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-75sz7\" (UID: \"0fbb6cd7-df93-408b-8513-ba0ff064c5a2\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:56:59.116535 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:59.116518 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb6cd7-df93-408b-8513-ba0ff064c5a2-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-75sz7\" (UID: \"0fbb6cd7-df93-408b-8513-ba0ff064c5a2\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:56:59.236246 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:59.236161 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:56:59.360254 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:59.360219 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7"] Apr 16 14:56:59.363382 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:56:59.363354 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fbb6cd7_df93_408b_8513_ba0ff064c5a2.slice/crio-48837d5b3ed08975aa83a17af39e941f68418a21f58e5a32495081092813ea46 WatchSource:0}: Error finding container 48837d5b3ed08975aa83a17af39e941f68418a21f58e5a32495081092813ea46: Status 404 returned error can't find the container with id 48837d5b3ed08975aa83a17af39e941f68418a21f58e5a32495081092813ea46 Apr 16 14:56:59.682361 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:59.682324 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" event={"ID":"0fbb6cd7-df93-408b-8513-ba0ff064c5a2","Type":"ContainerStarted","Data":"86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20"} Apr 16 14:56:59.682361 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:56:59.682362 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" event={"ID":"0fbb6cd7-df93-408b-8513-ba0ff064c5a2","Type":"ContainerStarted","Data":"48837d5b3ed08975aa83a17af39e941f68418a21f58e5a32495081092813ea46"} Apr 16 14:57:03.696146 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:03.696110 2574 generic.go:358] "Generic (PLEG): container finished" podID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerID="86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20" exitCode=0 Apr 16 14:57:03.696537 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:03.696184 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" event={"ID":"0fbb6cd7-df93-408b-8513-ba0ff064c5a2","Type":"ContainerDied","Data":"86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20"} Apr 16 14:57:04.701030 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:04.700995 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" event={"ID":"0fbb6cd7-df93-408b-8513-ba0ff064c5a2","Type":"ContainerStarted","Data":"0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309"} Apr 16 14:57:04.701422 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:04.701292 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:57:04.702716 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:04.702689 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 16 14:57:04.718085 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:04.718040 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podStartSLOduration=6.718025785 podStartE2EDuration="6.718025785s" podCreationTimestamp="2026-04-16 14:56:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:57:04.717085993 +0000 UTC m=+3466.078628763" watchObservedRunningTime="2026-04-16 14:57:04.718025785 +0000 UTC m=+3466.079568554" Apr 16 14:57:05.552262 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:05.552218 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.133.0.63:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.133.0.63:8080: connect: connection refused" Apr 16 14:57:05.703776 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:05.703734 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 16 14:57:06.662218 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.662191 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:57:06.707509 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.707424 2574 generic.go:358] "Generic (PLEG): container finished" podID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerID="3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6" exitCode=0 Apr 16 14:57:06.707509 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.707492 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" Apr 16 14:57:06.708002 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.707515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" event={"ID":"b040e0a2-ef6c-4077-a18c-dc980cd46cf9","Type":"ContainerDied","Data":"3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6"} Apr 16 14:57:06.708002 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.707553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt" event={"ID":"b040e0a2-ef6c-4077-a18c-dc980cd46cf9","Type":"ContainerDied","Data":"c16b33afc24295f20629c0511b56fd8fcd19272e1b7a61a089751d6b584b0b49"} Apr 16 14:57:06.708002 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.707571 2574 scope.go:117] "RemoveContainer" containerID="3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6" Apr 16 14:57:06.717463 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.717441 2574 scope.go:117] "RemoveContainer" containerID="3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129" Apr 16 14:57:06.724501 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.724481 2574 scope.go:117] "RemoveContainer" containerID="3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6" Apr 16 14:57:06.724778 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:57:06.724757 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6\": container with ID starting with 3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6 not found: ID does not exist" 
containerID="3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6" Apr 16 14:57:06.724850 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.724789 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6"} err="failed to get container status \"3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6\": rpc error: code = NotFound desc = could not find container \"3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6\": container with ID starting with 3e483302da1cdec178f73251d4bb3f27262ff925052f8b5a588fe094c7cb47b6 not found: ID does not exist" Apr 16 14:57:06.724850 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.724811 2574 scope.go:117] "RemoveContainer" containerID="3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129" Apr 16 14:57:06.725056 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:57:06.725039 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129\": container with ID starting with 3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129 not found: ID does not exist" containerID="3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129" Apr 16 14:57:06.725096 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.725062 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129"} err="failed to get container status \"3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129\": rpc error: code = NotFound desc = could not find container \"3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129\": container with ID starting with 3a54a1eab492866bf287a0db1f5db0c5ddc287ed7b526cfd242be34c07766129 not found: ID does not exist" Apr 16 14:57:06.779600 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.779542 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b040e0a2-ef6c-4077-a18c-dc980cd46cf9-kserve-provision-location\") pod \"b040e0a2-ef6c-4077-a18c-dc980cd46cf9\" (UID: \"b040e0a2-ef6c-4077-a18c-dc980cd46cf9\") " Apr 16 14:57:06.779900 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.779879 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b040e0a2-ef6c-4077-a18c-dc980cd46cf9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b040e0a2-ef6c-4077-a18c-dc980cd46cf9" (UID: "b040e0a2-ef6c-4077-a18c-dc980cd46cf9"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:57:06.880963 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:06.880916 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b040e0a2-ef6c-4077-a18c-dc980cd46cf9-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:57:07.027315 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:07.027281 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt"] Apr 16 14:57:07.032714 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:07.032685 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-kk5zt"] Apr 16 14:57:07.186899 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:07.186865 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" path="/var/lib/kubelet/pods/b040e0a2-ef6c-4077-a18c-dc980cd46cf9/volumes" Apr 16 14:57:15.704634 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:15.704554 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 16 14:57:25.703762 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:25.703716 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 16 14:57:35.704699 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:35.704652 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 16 14:57:45.704810 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:45.704717 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 16 14:57:55.704263 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:57:55.704221 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.64:8080: connect: connection refused" Apr 16 14:58:05.705458 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:05.705427 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:58:09.039710 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.039673 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7"] Apr 16 14:58:09.040178 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.040067 2574 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" containerID="cri-o://0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309" gracePeriod=30 Apr 16 14:58:09.103222 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.103185 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28"] Apr 16 14:58:09.103514 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.103501 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="storage-initializer" Apr 16 14:58:09.103559 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.103515 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="storage-initializer" Apr 16 14:58:09.103559 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.103534 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="kserve-container" Apr 16 14:58:09.103559 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.103539 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="kserve-container" Apr 16 14:58:09.103702 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.103595 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="b040e0a2-ef6c-4077-a18c-dc980cd46cf9" containerName="kserve-container" Apr 16 14:58:09.106470 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.106454 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:58:09.108734 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.108703 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 14:58:09.114477 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.114446 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28"] Apr 16 14:58:09.157478 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.157442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51c02660-545e-40b0-9a84-ce041dadbcc1-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5f45dc59f6-6gg28\" (UID: \"51c02660-545e-40b0-9a84-ce041dadbcc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:58:09.258240 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.258204 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51c02660-545e-40b0-9a84-ce041dadbcc1-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5f45dc59f6-6gg28\" (UID: \"51c02660-545e-40b0-9a84-ce041dadbcc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:58:09.258555 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.258538 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51c02660-545e-40b0-9a84-ce041dadbcc1-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-5f45dc59f6-6gg28\" (UID: 
\"51c02660-545e-40b0-9a84-ce041dadbcc1\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:58:09.417027 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.416986 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:58:09.536158 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.536125 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28"] Apr 16 14:58:09.539330 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:58:09.539301 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51c02660_545e_40b0_9a84_ce041dadbcc1.slice/crio-91346ad111db1fbb7a571b176faab20893fc1b3bb2867e5cee9c47365420c460 WatchSource:0}: Error finding container 91346ad111db1fbb7a571b176faab20893fc1b3bb2867e5cee9c47365420c460: Status 404 returned error can't find the container with id 91346ad111db1fbb7a571b176faab20893fc1b3bb2867e5cee9c47365420c460 Apr 16 14:58:09.896194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.896156 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" event={"ID":"51c02660-545e-40b0-9a84-ce041dadbcc1","Type":"ContainerStarted","Data":"11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a"} Apr 16 14:58:09.896194 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:09.896194 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" event={"ID":"51c02660-545e-40b0-9a84-ce041dadbcc1","Type":"ContainerStarted","Data":"91346ad111db1fbb7a571b176faab20893fc1b3bb2867e5cee9c47365420c460"} Apr 16 14:58:10.900659 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:10.900627 2574 generic.go:358] "Generic (PLEG): container finished" podID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerID="11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a" exitCode=0 Apr 16 14:58:10.901033 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:10.900692 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" event={"ID":"51c02660-545e-40b0-9a84-ce041dadbcc1","Type":"ContainerDied","Data":"11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a"} Apr 16 14:58:11.905144 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:11.905108 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" event={"ID":"51c02660-545e-40b0-9a84-ce041dadbcc1","Type":"ContainerStarted","Data":"fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a"} Apr 16 14:58:11.905688 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:11.905308 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:58:11.906693 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:11.906667 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:58:11.921515 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:11.921471 2574 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podStartSLOduration=2.921456257 podStartE2EDuration="2.921456257s" podCreationTimestamp="2026-04-16 14:58:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:58:11.919669598 +0000 UTC m=+3533.281212366" watchObservedRunningTime="2026-04-16 14:58:11.921456257 +0000 UTC m=+3533.282999025" Apr 16 14:58:12.722681 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.722658 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:58:12.788110 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.788025 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb6cd7-df93-408b-8513-ba0ff064c5a2-kserve-provision-location\") pod \"0fbb6cd7-df93-408b-8513-ba0ff064c5a2\" (UID: \"0fbb6cd7-df93-408b-8513-ba0ff064c5a2\") " Apr 16 14:58:12.788353 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.788330 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fbb6cd7-df93-408b-8513-ba0ff064c5a2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0fbb6cd7-df93-408b-8513-ba0ff064c5a2" (UID: "0fbb6cd7-df93-408b-8513-ba0ff064c5a2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:58:12.889297 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.889257 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0fbb6cd7-df93-408b-8513-ba0ff064c5a2-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:58:12.912591 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.912541 2574 generic.go:358] "Generic (PLEG): container finished" podID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerID="0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309" exitCode=0 Apr 16 14:58:12.913048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.912604 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" event={"ID":"0fbb6cd7-df93-408b-8513-ba0ff064c5a2","Type":"ContainerDied","Data":"0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309"} Apr 16 14:58:12.913048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.912637 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" Apr 16 14:58:12.913048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.912649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7" event={"ID":"0fbb6cd7-df93-408b-8513-ba0ff064c5a2","Type":"ContainerDied","Data":"48837d5b3ed08975aa83a17af39e941f68418a21f58e5a32495081092813ea46"} Apr 16 14:58:12.913048 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.912669 2574 scope.go:117] "RemoveContainer" containerID="0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309" Apr 16 14:58:12.913392 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.913120 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:58:12.921290 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.921059 2574 scope.go:117] "RemoveContainer" containerID="86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20" Apr 16 14:58:12.928237 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.928221 2574 scope.go:117] "RemoveContainer" containerID="0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309" Apr 16 14:58:12.928479 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:58:12.928458 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309\": container with ID starting with 0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309 not found: ID does not exist" containerID="0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309" Apr 16 14:58:12.928524 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.928489 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309"} err="failed to get container status \"0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309\": rpc error: code = NotFound desc = could not find container \"0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309\": container with ID starting with 0e11232099d6af41e688b4c9359d9befdb1c6d1740819d055ad295a8ffff1309 not found: ID does not exist" Apr 16 14:58:12.928524 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.928506 2574 scope.go:117] "RemoveContainer" containerID="86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20" Apr 16 14:58:12.928766 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:58:12.928750 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20\": container with ID starting with 86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20 not found: ID does not exist" containerID="86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20" Apr 16 14:58:12.928816 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.928772 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20"} err="failed to get container status \"86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20\": rpc 
error: code = NotFound desc = could not find container \"86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20\": container with ID starting with 86dee7c870136a8dde420b2468bfe4fa97465305694733b9eee6811e3a8bbb20 not found: ID does not exist" Apr 16 14:58:12.935265 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.935244 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7"] Apr 16 14:58:12.940799 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:12.940775 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-75sz7"] Apr 16 14:58:13.187523 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:13.187489 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" path="/var/lib/kubelet/pods/0fbb6cd7-df93-408b-8513-ba0ff064c5a2/volumes" Apr 16 14:58:22.913853 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:22.913805 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:58:32.913923 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:32.913877 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:58:42.913765 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:42.913720 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:58:52.913310 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:58:52.913264 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:59:02.913232 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:02.913181 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:59:12.913664 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:12.913617 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.65:8080: connect: connection refused" Apr 16 14:59:17.187375 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:17.187345 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:59:19.233494 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.233457 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28"] Apr 16 14:59:19.233976 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.233823 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" containerID="cri-o://fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a" gracePeriod=30 Apr 16 14:59:19.359116 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.359076 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k"] Apr 16 14:59:19.359417 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.359405 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" Apr 16 14:59:19.359463 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.359419 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" Apr 16 14:59:19.359463 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.359434 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="storage-initializer" Apr 16 14:59:19.359463 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.359440 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="storage-initializer" Apr 16 14:59:19.359561 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.359491 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="0fbb6cd7-df93-408b-8513-ba0ff064c5a2" containerName="kserve-container" Apr 16 14:59:19.362441 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.362423 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.364536 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.364518 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 14:59:19.369623 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.369537 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k"] Apr 16 14:59:19.431627 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.431567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b656b0b-6393-4f5e-801a-db8f9cc359a3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.431811 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.431699 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3b656b0b-6393-4f5e-801a-db8f9cc359a3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.533130 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.533027 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b656b0b-6393-4f5e-801a-db8f9cc359a3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.533323 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.533130 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3b656b0b-6393-4f5e-801a-db8f9cc359a3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.533482 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.533457 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b656b0b-6393-4f5e-801a-db8f9cc359a3-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.533841 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.533817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3b656b0b-6393-4f5e-801a-db8f9cc359a3-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.673938 ip-10-0-139-151 kubenswrapper[2574]: I0416 
14:59:19.673900 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:19.797008 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.796852 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k"] Apr 16 14:59:19.799727 ip-10-0-139-151 kubenswrapper[2574]: W0416 14:59:19.799693 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b656b0b_6393_4f5e_801a_db8f9cc359a3.slice/crio-a4b39d25451fddc17498730c24e26355273798fc9118cdb6fa6432c2ed80f05e WatchSource:0}: Error finding container a4b39d25451fddc17498730c24e26355273798fc9118cdb6fa6432c2ed80f05e: Status 404 returned error can't find the container with id a4b39d25451fddc17498730c24e26355273798fc9118cdb6fa6432c2ed80f05e Apr 16 14:59:19.801317 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:19.801302 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:59:20.113473 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:20.113438 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" event={"ID":"3b656b0b-6393-4f5e-801a-db8f9cc359a3","Type":"ContainerStarted","Data":"5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d"} Apr 16 14:59:20.113473 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:20.113474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" event={"ID":"3b656b0b-6393-4f5e-801a-db8f9cc359a3","Type":"ContainerStarted","Data":"a4b39d25451fddc17498730c24e26355273798fc9118cdb6fa6432c2ed80f05e"} Apr 16 14:59:21.117352 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:21.117314 2574 generic.go:358] "Generic (PLEG): container finished" podID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerID="5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d" exitCode=0 Apr 16 14:59:21.117879 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:21.117398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" event={"ID":"3b656b0b-6393-4f5e-801a-db8f9cc359a3","Type":"ContainerDied","Data":"5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d"} Apr 16 14:59:22.127003 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:22.126964 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" event={"ID":"3b656b0b-6393-4f5e-801a-db8f9cc359a3","Type":"ContainerStarted","Data":"cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf"} Apr 16 14:59:22.127478 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:22.127198 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 14:59:22.128633 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:22.128606 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 14:59:22.143565 
ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:22.143513 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podStartSLOduration=3.143495812 podStartE2EDuration="3.143495812s" podCreationTimestamp="2026-04-16 14:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:59:22.142141581 +0000 UTC m=+3603.503684368" watchObservedRunningTime="2026-04-16 14:59:22.143495812 +0000 UTC m=+3603.505038580" Apr 16 14:59:23.130914 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:23.130861 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 14:59:23.975089 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:23.975064 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:59:24.074196 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.074168 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51c02660-545e-40b0-9a84-ce041dadbcc1-kserve-provision-location\") pod \"51c02660-545e-40b0-9a84-ce041dadbcc1\" (UID: \"51c02660-545e-40b0-9a84-ce041dadbcc1\") " Apr 16 14:59:24.074492 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.074472 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51c02660-545e-40b0-9a84-ce041dadbcc1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "51c02660-545e-40b0-9a84-ce041dadbcc1" (UID: "51c02660-545e-40b0-9a84-ce041dadbcc1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:59:24.135028 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.134993 2574 generic.go:358] "Generic (PLEG): container finished" podID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerID="fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a" exitCode=0 Apr 16 14:59:24.135398 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.135065 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" Apr 16 14:59:24.135398 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.135090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" event={"ID":"51c02660-545e-40b0-9a84-ce041dadbcc1","Type":"ContainerDied","Data":"fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a"} Apr 16 14:59:24.135398 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.135140 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28" event={"ID":"51c02660-545e-40b0-9a84-ce041dadbcc1","Type":"ContainerDied","Data":"91346ad111db1fbb7a571b176faab20893fc1b3bb2867e5cee9c47365420c460"} Apr 16 14:59:24.135398 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.135163 2574 scope.go:117] "RemoveContainer" containerID="fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a" Apr 16 14:59:24.143162 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.143143 2574 scope.go:117] "RemoveContainer" containerID="11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a" Apr 16 14:59:24.150424 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.150407 2574 scope.go:117] "RemoveContainer" containerID="fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a" Apr 16 14:59:24.150729 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:59:24.150708 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a\": container with ID starting with fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a not found: ID does not exist" containerID="fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a" Apr 16 14:59:24.150820 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.150738 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a"} err="failed to get container status \"fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a\": rpc error: code = NotFound desc = could not find container \"fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a\": container with ID starting with fb9d7e1787348815e134fd1f47812722c95fbe911f71d43dae5d31c243633e9a not found: ID does not exist" Apr 16 14:59:24.150820 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.150757 2574 scope.go:117] "RemoveContainer" containerID="11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a" Apr 16 14:59:24.150992 ip-10-0-139-151 kubenswrapper[2574]: E0416 14:59:24.150976 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a\": container with ID starting with 11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a not found: ID does not exist" containerID="11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a" Apr 16 14:59:24.151034 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.150997 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a"} err="failed to get container status \"11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a\": rpc error: code = 
NotFound desc = could not find container \"11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a\": container with ID starting with 11ddf514b4a981698c0bdc1b7de8c3650e79e28b57546eed764d90ca0737277a not found: ID does not exist" Apr 16 14:59:24.155569 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.155536 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28"] Apr 16 14:59:24.158830 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.158803 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-5f45dc59f6-6gg28"] Apr 16 14:59:24.174742 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:24.174710 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51c02660-545e-40b0-9a84-ce041dadbcc1-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 14:59:25.187152 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:25.187107 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" path="/var/lib/kubelet/pods/51c02660-545e-40b0-9a84-ce041dadbcc1/volumes" Apr 16 14:59:33.131773 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:33.131717 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 14:59:42.310730 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:42.310699 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:59:42.320411 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:42.320386 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log" Apr 16 14:59:43.131713 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:43.131668 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 14:59:53.131531 ip-10-0-139-151 kubenswrapper[2574]: I0416 14:59:53.131475 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 15:00:03.131071 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:03.131024 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 15:00:13.131046 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:13.130996 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" 
podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 15:00:23.131274 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:23.131222 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.66:8080: connect: connection refused" Apr 16 15:00:28.184774 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:28.184741 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 15:00:29.407882 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:29.407844 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k"] Apr 16 15:00:29.408378 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:29.408187 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" containerID="cri-o://cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf" gracePeriod=30 Apr 16 15:00:30.501024 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.500968 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"] Apr 16 15:00:30.501432 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.501336 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="storage-initializer" Apr 16 15:00:30.501432 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.501351 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="storage-initializer" Apr 16 15:00:30.501432 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.501360 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" Apr 16 15:00:30.501432 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.501365 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" Apr 16 15:00:30.501432 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.501422 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="51c02660-545e-40b0-9a84-ce041dadbcc1" containerName="kserve-container" Apr 16 15:00:30.505647 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.505626 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" Apr 16 15:00:30.517821 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.517793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"] Apr 16 15:00:30.618761 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.618719 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebce2c53-6d65-4448-9186-b380fe733654-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq\" (UID: \"ebce2c53-6d65-4448-9186-b380fe733654\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" Apr 16 15:00:30.719711 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.719671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebce2c53-6d65-4448-9186-b380fe733654-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq\" (UID: \"ebce2c53-6d65-4448-9186-b380fe733654\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" Apr 16 15:00:30.720056 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.720036 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebce2c53-6d65-4448-9186-b380fe733654-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq\" (UID: \"ebce2c53-6d65-4448-9186-b380fe733654\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" Apr 16 15:00:30.815296 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.815208 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" Apr 16 15:00:30.950623 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:30.950572 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"] Apr 16 15:00:30.953487 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:00:30.953457 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podebce2c53_6d65_4448_9186_b380fe733654.slice/crio-39321cc5b8848844ea6a69e8e999fcc9a6260acbc1ac0eea0f877d1c2a8d92a7 WatchSource:0}: Error finding container 39321cc5b8848844ea6a69e8e999fcc9a6260acbc1ac0eea0f877d1c2a8d92a7: Status 404 returned error can't find the container with id 39321cc5b8848844ea6a69e8e999fcc9a6260acbc1ac0eea0f877d1c2a8d92a7 Apr 16 15:00:31.328261 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:31.328222 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" event={"ID":"ebce2c53-6d65-4448-9186-b380fe733654","Type":"ContainerStarted","Data":"83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416"} Apr 16 15:00:31.328261 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:31.328265 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" event={"ID":"ebce2c53-6d65-4448-9186-b380fe733654","Type":"ContainerStarted","Data":"39321cc5b8848844ea6a69e8e999fcc9a6260acbc1ac0eea0f877d1c2a8d92a7"} Apr 16 15:00:34.050702 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.050677 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 15:00:34.148337 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.148225 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b656b0b-6393-4f5e-801a-db8f9cc359a3-kserve-provision-location\") pod \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " Apr 16 15:00:34.148337 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.148298 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3b656b0b-6393-4f5e-801a-db8f9cc359a3-cabundle-cert\") pod \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\" (UID: \"3b656b0b-6393-4f5e-801a-db8f9cc359a3\") " Apr 16 15:00:34.148639 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.148618 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b656b0b-6393-4f5e-801a-db8f9cc359a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3b656b0b-6393-4f5e-801a-db8f9cc359a3" (UID: "3b656b0b-6393-4f5e-801a-db8f9cc359a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 15:00:34.148709 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.148669 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b656b0b-6393-4f5e-801a-db8f9cc359a3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "3b656b0b-6393-4f5e-801a-db8f9cc359a3" (UID: "3b656b0b-6393-4f5e-801a-db8f9cc359a3"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 15:00:34.249275 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.249230 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b656b0b-6393-4f5e-801a-db8f9cc359a3-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 15:00:34.249275 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.249267 2574 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/3b656b0b-6393-4f5e-801a-db8f9cc359a3-cabundle-cert\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\"" Apr 16 15:00:34.338723 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.338684 2574 generic.go:358] "Generic (PLEG): container finished" podID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerID="cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf" exitCode=0 Apr 16 15:00:34.338926 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.338785 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" Apr 16 15:00:34.338926 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.338777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" event={"ID":"3b656b0b-6393-4f5e-801a-db8f9cc359a3","Type":"ContainerDied","Data":"cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf"} Apr 16 15:00:34.338926 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.338827 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k" event={"ID":"3b656b0b-6393-4f5e-801a-db8f9cc359a3","Type":"ContainerDied","Data":"a4b39d25451fddc17498730c24e26355273798fc9118cdb6fa6432c2ed80f05e"} Apr 16 15:00:34.338926 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.338848 2574 scope.go:117] "RemoveContainer" containerID="cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf" Apr 16 15:00:34.346932 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.346811 2574 scope.go:117] "RemoveContainer" containerID="5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d" Apr 16 15:00:34.354355 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.354336 2574 scope.go:117] "RemoveContainer" containerID="cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf" Apr 16 15:00:34.354646 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:00:34.354620 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf\": container with ID starting with cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf not found: ID does not exist" containerID="cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf" Apr 16 15:00:34.354753 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.354650 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf"} err="failed to get container status \"cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf\": rpc error: code = NotFound desc = could not find container \"cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf\": container with ID starting with 
cf174bd6de68aa1969d6cc659dfc4964ded1ea651c2f169085f9fbebb84358cf not found: ID does not exist" Apr 16 15:00:34.354753 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.354669 2574 scope.go:117] "RemoveContainer" containerID="5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d" Apr 16 15:00:34.354926 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:00:34.354902 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d\": container with ID starting with 5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d not found: ID does not exist" containerID="5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d" Apr 16 15:00:34.355011 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.354931 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d"} err="failed to get container status \"5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d\": rpc error: code = NotFound desc = could not find container \"5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d\": container with ID starting with 5167ab210a7df9a5e18a86102505205f0a7aae3c558d2ae1f523a45e6cc7363d not found: ID does not exist" Apr 16 15:00:34.358710 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.358685 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k"] Apr 16 15:00:34.363469 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:34.363448 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-ccbfd99cb-jxs5k"] Apr 16 15:00:35.187497 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:35.187464 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" path="/var/lib/kubelet/pods/3b656b0b-6393-4f5e-801a-db8f9cc359a3/volumes" Apr 16 15:00:37.349714 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:37.349687 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq_ebce2c53-6d65-4448-9186-b380fe733654/storage-initializer/0.log" Apr 16 15:00:37.350113 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:37.349729 2574 generic.go:358] "Generic (PLEG): container finished" podID="ebce2c53-6d65-4448-9186-b380fe733654" containerID="83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416" exitCode=1 Apr 16 15:00:37.350113 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:37.349806 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" event={"ID":"ebce2c53-6d65-4448-9186-b380fe733654","Type":"ContainerDied","Data":"83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416"} Apr 16 15:00:38.354321 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:38.354288 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq_ebce2c53-6d65-4448-9186-b380fe733654/storage-initializer/0.log" Apr 16 15:00:38.354719 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:38.354418 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" 
event={"ID":"ebce2c53-6d65-4448-9186-b380fe733654","Type":"ContainerStarted","Data":"6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5"} Apr 16 15:00:40.493976 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:40.493943 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"] Apr 16 15:00:40.494362 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:40.494190 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" podUID="ebce2c53-6d65-4448-9186-b380fe733654" containerName="storage-initializer" containerID="cri-o://6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5" gracePeriod=30 Apr 16 15:00:41.576426 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.576386 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"] Apr 16 15:00:41.576836 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.576712 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="storage-initializer" Apr 16 15:00:41.576836 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.576725 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="storage-initializer" Apr 16 15:00:41.576836 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.576736 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" Apr 16 15:00:41.576836 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.576741 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" Apr 16 15:00:41.576836 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.576786 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="3b656b0b-6393-4f5e-801a-db8f9cc359a3" containerName="kserve-container" Apr 16 15:00:41.579785 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.579767 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" Apr 16 15:00:41.582403 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.582381 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 15:00:41.587623 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.587599 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"] Apr 16 15:00:41.713609 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.713548 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/878c76bb-d6a6-427e-92d1-8156bee36488-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" Apr 16 15:00:41.713609 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.713610 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/878c76bb-d6a6-427e-92d1-8156bee36488-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" Apr 16 15:00:41.815001 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.814963 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/878c76bb-d6a6-427e-92d1-8156bee36488-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" Apr 16 15:00:41.815199 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.815006 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/878c76bb-d6a6-427e-92d1-8156bee36488-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" Apr 16 15:00:41.815441 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.815419 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/878c76bb-d6a6-427e-92d1-8156bee36488-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" Apr 16 15:00:41.815723 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:41.815706 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/878c76bb-d6a6-427e-92d1-8156bee36488-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" Apr 16 15:00:41.891162 ip-10-0-139-151 kubenswrapper[2574]: I0416 
15:00:41.891076 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"
Apr 16 15:00:42.012832 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:42.012795 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"]
Apr 16 15:00:42.015848 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:00:42.015814 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod878c76bb_d6a6_427e_92d1_8156bee36488.slice/crio-baf6b6c3e8facab813e31eac9dfe8ba97f7f0315c097decb183080329aaa2423 WatchSource:0}: Error finding container baf6b6c3e8facab813e31eac9dfe8ba97f7f0315c097decb183080329aaa2423: Status 404 returned error can't find the container with id baf6b6c3e8facab813e31eac9dfe8ba97f7f0315c097decb183080329aaa2423
Apr 16 15:00:42.367084 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:42.367046 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" event={"ID":"878c76bb-d6a6-427e-92d1-8156bee36488","Type":"ContainerStarted","Data":"4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45"}
Apr 16 15:00:42.367084 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:42.367087 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" event={"ID":"878c76bb-d6a6-427e-92d1-8156bee36488","Type":"ContainerStarted","Data":"baf6b6c3e8facab813e31eac9dfe8ba97f7f0315c097decb183080329aaa2423"}
Apr 16 15:00:43.235416 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.235390 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq_ebce2c53-6d65-4448-9186-b380fe733654/storage-initializer/1.log"
Apr 16 15:00:43.235822 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.235758 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq_ebce2c53-6d65-4448-9186-b380fe733654/storage-initializer/0.log"
Apr 16 15:00:43.235888 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.235832 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"
Apr 16 15:00:43.328029 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.327993 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebce2c53-6d65-4448-9186-b380fe733654-kserve-provision-location\") pod \"ebce2c53-6d65-4448-9186-b380fe733654\" (UID: \"ebce2c53-6d65-4448-9186-b380fe733654\") "
Apr 16 15:00:43.328311 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.328278 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ebce2c53-6d65-4448-9186-b380fe733654-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ebce2c53-6d65-4448-9186-b380fe733654" (UID: "ebce2c53-6d65-4448-9186-b380fe733654"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:00:43.371003 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.370973 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq_ebce2c53-6d65-4448-9186-b380fe733654/storage-initializer/1.log"
Apr 16 15:00:43.371337 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.371320 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq_ebce2c53-6d65-4448-9186-b380fe733654/storage-initializer/0.log"
Apr 16 15:00:43.371413 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.371360 2574 generic.go:358] "Generic (PLEG): container finished" podID="ebce2c53-6d65-4448-9186-b380fe733654" containerID="6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5" exitCode=1
Apr 16 15:00:43.371413 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.371389 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" event={"ID":"ebce2c53-6d65-4448-9186-b380fe733654","Type":"ContainerDied","Data":"6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5"}
Apr 16 15:00:43.371506 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.371420 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq" event={"ID":"ebce2c53-6d65-4448-9186-b380fe733654","Type":"ContainerDied","Data":"39321cc5b8848844ea6a69e8e999fcc9a6260acbc1ac0eea0f877d1c2a8d92a7"}
Apr 16 15:00:43.371506 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.371431 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"
Apr 16 15:00:43.371506 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.371436 2574 scope.go:117] "RemoveContainer" containerID="6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5"
Apr 16 15:00:43.372819 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.372795 2574 generic.go:358] "Generic (PLEG): container finished" podID="878c76bb-d6a6-427e-92d1-8156bee36488" containerID="4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45" exitCode=0
Apr 16 15:00:43.372932 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.372875 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" event={"ID":"878c76bb-d6a6-427e-92d1-8156bee36488","Type":"ContainerDied","Data":"4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45"}
Apr 16 15:00:43.380134 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.379712 2574 scope.go:117] "RemoveContainer" containerID="83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416"
Apr 16 15:00:43.388863 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.388838 2574 scope.go:117] "RemoveContainer" containerID="6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5"
Apr 16 15:00:43.389321 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:00:43.389299 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5\": container with ID starting with 6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5 not found: ID does not exist" containerID="6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5"
Apr 16 15:00:43.389415 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.389333 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5"} err="failed to get container status \"6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5\": rpc error: code = NotFound desc = could not find container \"6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5\": container with ID starting with 6cc098b18185026890d8c34be13dadda856d198a44cac35444f6e0972836c0c5 not found: ID does not exist"
Apr 16 15:00:43.389415 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.389361 2574 scope.go:117] "RemoveContainer" containerID="83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416"
Apr 16 15:00:43.389683 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:00:43.389659 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416\": container with ID starting with 83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416 not found: ID does not exist" containerID="83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416"
Apr 16 15:00:43.389777 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.389692 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416"} err="failed to get container status \"83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416\": rpc error: code = NotFound desc = could not find container \"83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416\": container with ID starting with 83f18b7ca1598137309a7bb4f9935f3304b6e13f8590542617d2251d5e393416 not found: ID does not exist"
Apr 16 15:00:43.417454 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.417427 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"]
Apr 16 15:00:43.421930 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.421900 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-6cf57c4f9d-458gq"]
Apr 16 15:00:43.429066 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:43.429041 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ebce2c53-6d65-4448-9186-b380fe733654-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:00:44.378149 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:44.378116 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" event={"ID":"878c76bb-d6a6-427e-92d1-8156bee36488","Type":"ContainerStarted","Data":"34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd"}
Apr 16 15:00:44.378555 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:44.378315 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"
Apr 16 15:00:44.379265 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:44.379241 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:00:44.400649 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:44.400601 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podStartSLOduration=3.400575762 podStartE2EDuration="3.400575762s" podCreationTimestamp="2026-04-16 15:00:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:00:44.398825128 +0000 UTC m=+3685.760367897" watchObservedRunningTime="2026-04-16 15:00:44.400575762 +0000 UTC m=+3685.762118587"
Apr 16 15:00:45.186875 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:45.186840 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebce2c53-6d65-4448-9186-b380fe733654" path="/var/lib/kubelet/pods/ebce2c53-6d65-4448-9186-b380fe733654/volumes"
Apr 16 15:00:45.381066 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:45.381024 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:00:55.381007 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:00:55.380962 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:01:05.381181 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:01:05.381130 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:01:15.381987 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:01:15.381938 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:01:25.380999 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:01:25.380949 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:01:35.381959 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:01:35.381908 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:01:45.381437 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:01:45.381387 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:01:55.382047 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:01:55.382006 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"
Apr 16 15:02:01.618036 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:01.617998 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"]
Apr 16 15:02:01.618415 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:01.618275 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" containerID="cri-o://34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd" gracePeriod=30
Apr 16 15:02:02.681257 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.681217 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"]
Apr 16 15:02:02.681696 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.681513 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebce2c53-6d65-4448-9186-b380fe733654" containerName="storage-initializer"
Apr 16 15:02:02.681696 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.681525 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebce2c53-6d65-4448-9186-b380fe733654" containerName="storage-initializer"
Apr 16 15:02:02.681696 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.681593 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ebce2c53-6d65-4448-9186-b380fe733654" containerName="storage-initializer"
Apr 16 15:02:02.681696 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.681603 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebce2c53-6d65-4448-9186-b380fe733654" containerName="storage-initializer"
Apr 16 15:02:02.681696 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.681681 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebce2c53-6d65-4448-9186-b380fe733654" containerName="storage-initializer"
Apr 16 15:02:02.681696 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.681697 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="ebce2c53-6d65-4448-9186-b380fe733654" containerName="storage-initializer"
Apr 16 15:02:02.684403 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.684388 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"
Apr 16 15:02:02.693732 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.693704 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"]
Apr 16 15:02:02.780041 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.780000 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc429169-eec1-4d83-a3b7-a828411240ec-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj\" (UID: \"bc429169-eec1-4d83-a3b7-a828411240ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"
Apr 16 15:02:02.881315 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.881280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc429169-eec1-4d83-a3b7-a828411240ec-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj\" (UID: \"bc429169-eec1-4d83-a3b7-a828411240ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"
Apr 16 15:02:02.881728 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.881702 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc429169-eec1-4d83-a3b7-a828411240ec-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj\" (UID: \"bc429169-eec1-4d83-a3b7-a828411240ec\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"
Apr 16 15:02:02.995307 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:02.995210 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"
Apr 16 15:02:03.116802 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:03.116766 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"]
Apr 16 15:02:03.121389 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:02:03.121362 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc429169_eec1_4d83_a3b7_a828411240ec.slice/crio-d3b2743c993cc59c6215846f9add4f28b3809f340d254cea406090ea976e5891 WatchSource:0}: Error finding container d3b2743c993cc59c6215846f9add4f28b3809f340d254cea406090ea976e5891: Status 404 returned error can't find the container with id d3b2743c993cc59c6215846f9add4f28b3809f340d254cea406090ea976e5891
Apr 16 15:02:03.613637 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:03.613592 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj" event={"ID":"bc429169-eec1-4d83-a3b7-a828411240ec","Type":"ContainerStarted","Data":"9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7"}
Apr 16 15:02:03.613637 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:03.613638 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj" event={"ID":"bc429169-eec1-4d83-a3b7-a828411240ec","Type":"ContainerStarted","Data":"d3b2743c993cc59c6215846f9add4f28b3809f340d254cea406090ea976e5891"}
Apr 16 15:02:05.381664 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:05.381621 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.68:8080: connect: connection refused"
Apr 16 15:02:06.259922 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.259896 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"
Apr 16 15:02:06.414172 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.414132 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/878c76bb-d6a6-427e-92d1-8156bee36488-kserve-provision-location\") pod \"878c76bb-d6a6-427e-92d1-8156bee36488\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") "
Apr 16 15:02:06.414639 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.414191 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/878c76bb-d6a6-427e-92d1-8156bee36488-cabundle-cert\") pod \"878c76bb-d6a6-427e-92d1-8156bee36488\" (UID: \"878c76bb-d6a6-427e-92d1-8156bee36488\") "
Apr 16 15:02:06.414639 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.414445 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/878c76bb-d6a6-427e-92d1-8156bee36488-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "878c76bb-d6a6-427e-92d1-8156bee36488" (UID: "878c76bb-d6a6-427e-92d1-8156bee36488"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:02:06.414639 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.414496 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/878c76bb-d6a6-427e-92d1-8156bee36488-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "878c76bb-d6a6-427e-92d1-8156bee36488" (UID: "878c76bb-d6a6-427e-92d1-8156bee36488"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:02:06.414639 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.414501 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/878c76bb-d6a6-427e-92d1-8156bee36488-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:02:06.515793 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.515755 2574 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/878c76bb-d6a6-427e-92d1-8156bee36488-cabundle-cert\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:02:06.626058 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.626014 2574 generic.go:358] "Generic (PLEG): container finished" podID="878c76bb-d6a6-427e-92d1-8156bee36488" containerID="34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd" exitCode=0
Apr 16 15:02:06.626232 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.626084 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"
Apr 16 15:02:06.626232 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.626104 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" event={"ID":"878c76bb-d6a6-427e-92d1-8156bee36488","Type":"ContainerDied","Data":"34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd"}
Apr 16 15:02:06.626232 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.626143 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2" event={"ID":"878c76bb-d6a6-427e-92d1-8156bee36488","Type":"ContainerDied","Data":"baf6b6c3e8facab813e31eac9dfe8ba97f7f0315c097decb183080329aaa2423"}
Apr 16 15:02:06.626232 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.626160 2574 scope.go:117] "RemoveContainer" containerID="34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd"
Apr 16 15:02:06.634118 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.634102 2574 scope.go:117] "RemoveContainer" containerID="4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45"
Apr 16 15:02:06.641790 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.641768 2574 scope.go:117] "RemoveContainer" containerID="34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd"
Apr 16 15:02:06.642068 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:02:06.642047 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd\": container with ID starting with 34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd not found: ID does not exist" containerID="34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd"
Apr 16 15:02:06.642131 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.642078 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd"} err="failed to get container status \"34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd\": rpc error: code = NotFound desc = could not find container \"34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd\": container with ID starting with 34f6116d02385abfe8f5299d988e2b5f33b176f67b73eeabbb3570b782ab82cd not found: ID does not exist"
Apr 16 15:02:06.642131 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.642097 2574 scope.go:117] "RemoveContainer" containerID="4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45"
Apr 16 15:02:06.642342 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:02:06.642322 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45\": container with ID starting with 4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45 not found: ID does not exist" containerID="4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45"
Apr 16 15:02:06.642384 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.642348 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45"} err="failed to get container status \"4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45\": rpc error: code = NotFound desc = could not find container \"4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45\": container with ID starting with 4732332dafcc6b7f76bc1bdac06ef32927cf7af052f86a8f53c7b31fed07da45 not found: ID does not exist"
Apr 16 15:02:06.647743 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.647716 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"]
Apr 16 15:02:06.652176 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:06.652150 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-968fc75cb-xbqd2"]
Apr 16 15:02:07.186824 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:07.186732 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" path="/var/lib/kubelet/pods/878c76bb-d6a6-427e-92d1-8156bee36488/volumes"
Apr 16 15:02:08.634097 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:08.634071 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_bc429169-eec1-4d83-a3b7-a828411240ec/storage-initializer/0.log"
Apr 16 15:02:08.634471 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:08.634109 2574 generic.go:358] "Generic (PLEG): container finished" podID="bc429169-eec1-4d83-a3b7-a828411240ec" containerID="9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7" exitCode=1
Apr 16 15:02:08.634471 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:08.634163 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj" event={"ID":"bc429169-eec1-4d83-a3b7-a828411240ec","Type":"ContainerDied","Data":"9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7"}
Apr 16 15:02:09.638691 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:09.638662 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_bc429169-eec1-4d83-a3b7-a828411240ec/storage-initializer/0.log"
Apr 16 15:02:09.639101 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:09.638720 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj" event={"ID":"bc429169-eec1-4d83-a3b7-a828411240ec","Type":"ContainerStarted","Data":"f561c1c8f6c535d9c948d942ba8c68dc5adf72d7cd74339f3539ed8e2792fa87"}
Apr 16 15:02:10.643206 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:10.643178 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_bc429169-eec1-4d83-a3b7-a828411240ec/storage-initializer/1.log"
Apr 16 15:02:10.643638 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:10.643530 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_bc429169-eec1-4d83-a3b7-a828411240ec/storage-initializer/0.log"
Apr 16 15:02:10.643638 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:10.643564 2574 generic.go:358] "Generic (PLEG): container finished" podID="bc429169-eec1-4d83-a3b7-a828411240ec" containerID="f561c1c8f6c535d9c948d942ba8c68dc5adf72d7cd74339f3539ed8e2792fa87" exitCode=1
Apr 16 15:02:10.643638 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:10.643618 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj" event={"ID":"bc429169-eec1-4d83-a3b7-a828411240ec","Type":"ContainerDied","Data":"f561c1c8f6c535d9c948d942ba8c68dc5adf72d7cd74339f3539ed8e2792fa87"}
Apr 16 15:02:10.643806 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:10.643656 2574 scope.go:117] "RemoveContainer" containerID="9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7"
Apr 16 15:02:10.644022 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:10.644004 2574 scope.go:117] "RemoveContainer" containerID="9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7"
Apr 16 15:02:10.654107 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:02:10.654077 2574 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_kserve-ci-e2e-test_bc429169-eec1-4d83-a3b7-a828411240ec_0 in pod sandbox d3b2743c993cc59c6215846f9add4f28b3809f340d254cea406090ea976e5891 from index: no such id: '9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7'" containerID="9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7"
Apr 16 15:02:10.654182 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:02:10.654127 2574 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_kserve-ci-e2e-test_bc429169-eec1-4d83-a3b7-a828411240ec_0 in pod sandbox d3b2743c993cc59c6215846f9add4f28b3809f340d254cea406090ea976e5891 from index: no such id: '9e41cc6f20146701498e028c56610ab8ab3fd055dd74f91221a67534142e0aa7'; Skipping pod \"isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_kserve-ci-e2e-test(bc429169-eec1-4d83-a3b7-a828411240ec)\"" logger="UnhandledError"
Apr 16 15:02:10.655449 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:02:10.655429 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_kserve-ci-e2e-test(bc429169-eec1-4d83-a3b7-a828411240ec)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj" podUID="bc429169-eec1-4d83-a3b7-a828411240ec"
Apr 16 15:02:11.647280 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:11.647254 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_bc429169-eec1-4d83-a3b7-a828411240ec/storage-initializer/1.log"
Apr 16 15:02:12.689140 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:12.689097 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"]
Apr 16 15:02:12.820093 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:12.820069 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_bc429169-eec1-4d83-a3b7-a828411240ec/storage-initializer/1.log"
Apr 16 15:02:12.820234 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:12.820133 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"
Apr 16 15:02:12.869065 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:12.869033 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc429169-eec1-4d83-a3b7-a828411240ec-kserve-provision-location\") pod \"bc429169-eec1-4d83-a3b7-a828411240ec\" (UID: \"bc429169-eec1-4d83-a3b7-a828411240ec\") "
Apr 16 15:02:12.869336 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:12.869317 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc429169-eec1-4d83-a3b7-a828411240ec-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc429169-eec1-4d83-a3b7-a828411240ec" (UID: "bc429169-eec1-4d83-a3b7-a828411240ec"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:02:12.970510 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:12.970433 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc429169-eec1-4d83-a3b7-a828411240ec-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:02:13.654830 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.654802 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj_bc429169-eec1-4d83-a3b7-a828411240ec/storage-initializer/1.log"
Apr 16 15:02:13.655007 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.654886 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj" event={"ID":"bc429169-eec1-4d83-a3b7-a828411240ec","Type":"ContainerDied","Data":"d3b2743c993cc59c6215846f9add4f28b3809f340d254cea406090ea976e5891"}
Apr 16 15:02:13.655007 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.654913 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"
Apr 16 15:02:13.655007 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.654921 2574 scope.go:117] "RemoveContainer" containerID="f561c1c8f6c535d9c948d942ba8c68dc5adf72d7cd74339f3539ed8e2792fa87"
Apr 16 15:02:13.681371 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.681337 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"]
Apr 16 15:02:13.688607 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.686914 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-55bbb477c8-lx7dj"]
Apr 16 15:02:13.795322 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795284 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"]
Apr 16 15:02:13.795706 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795680 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc429169-eec1-4d83-a3b7-a828411240ec" containerName="storage-initializer"
Apr 16 15:02:13.795706 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795694 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc429169-eec1-4d83-a3b7-a828411240ec" containerName="storage-initializer"
Apr 16 15:02:13.795778 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795708 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container"
Apr 16 15:02:13.795778 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795714 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container"
Apr 16 15:02:13.795778 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795724 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc429169-eec1-4d83-a3b7-a828411240ec" containerName="storage-initializer"
Apr 16 15:02:13.795778 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795729 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc429169-eec1-4d83-a3b7-a828411240ec" containerName="storage-initializer"
Apr 16 15:02:13.795778 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795744 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="storage-initializer"
Apr 16 15:02:13.795778 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795752 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="storage-initializer"
Apr 16 15:02:13.795961 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795797 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc429169-eec1-4d83-a3b7-a828411240ec" containerName="storage-initializer"
Apr 16 15:02:13.795961 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795807 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="878c76bb-d6a6-427e-92d1-8156bee36488" containerName="kserve-container"
Apr 16 15:02:13.795961 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.795895 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc429169-eec1-4d83-a3b7-a828411240ec" containerName="storage-initializer"
Apr 16 15:02:13.799928 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.799909 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:13.802122 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.802097 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\""
Apr 16 15:02:13.802122 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.802108 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tdtpg\""
Apr 16 15:02:13.802301 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.802185 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\""
Apr 16 15:02:13.808185 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.808160 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"]
Apr 16 15:02:13.879041 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.879007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/875adb15-a189-498c-baf0-4fea06028a24-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:13.879041 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.879047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/875adb15-a189-498c-baf0-4fea06028a24-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:13.979468 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.979381 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/875adb15-a189-498c-baf0-4fea06028a24-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:13.979468 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.979423 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/875adb15-a189-498c-baf0-4fea06028a24-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:13.979804 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.979783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/875adb15-a189-498c-baf0-4fea06028a24-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:13.980025 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:13.980010 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/875adb15-a189-498c-baf0-4fea06028a24-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:14.110502 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:14.110465 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:14.235389 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:14.235363 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"]
Apr 16 15:02:14.238010 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:02:14.237982 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod875adb15_a189_498c_baf0_4fea06028a24.slice/crio-0f00703c5c77d1363bf5d05c07ad69c77ac5f9afe5e4b6fe6adfba3c9d75549b WatchSource:0}: Error finding container 0f00703c5c77d1363bf5d05c07ad69c77ac5f9afe5e4b6fe6adfba3c9d75549b: Status 404 returned error can't find the container with id 0f00703c5c77d1363bf5d05c07ad69c77ac5f9afe5e4b6fe6adfba3c9d75549b
Apr 16 15:02:14.660366 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:14.660328 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" event={"ID":"875adb15-a189-498c-baf0-4fea06028a24","Type":"ContainerStarted","Data":"d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728"}
Apr 16 15:02:14.660366 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:14.660365 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" event={"ID":"875adb15-a189-498c-baf0-4fea06028a24","Type":"ContainerStarted","Data":"0f00703c5c77d1363bf5d05c07ad69c77ac5f9afe5e4b6fe6adfba3c9d75549b"}
Apr 16 15:02:15.187312 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:15.187279 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc429169-eec1-4d83-a3b7-a828411240ec" path="/var/lib/kubelet/pods/bc429169-eec1-4d83-a3b7-a828411240ec/volumes"
Apr 16 15:02:15.667566 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:15.667533 2574 generic.go:358] "Generic (PLEG): container finished" podID="875adb15-a189-498c-baf0-4fea06028a24" containerID="d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728" exitCode=0
Apr 16 15:02:15.667773 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:15.667618 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" event={"ID":"875adb15-a189-498c-baf0-4fea06028a24","Type":"ContainerDied","Data":"d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728"}
Apr 16 15:02:16.672729 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:16.672696 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" event={"ID":"875adb15-a189-498c-baf0-4fea06028a24","Type":"ContainerStarted","Data":"81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150"}
Apr 16 15:02:16.673125 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:16.672903 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:02:16.674350 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:16.674321 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:02:16.690187 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:16.690140 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podStartSLOduration=3.6901242610000002 podStartE2EDuration="3.690124261s" podCreationTimestamp="2026-04-16 15:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:02:16.68816878 +0000 UTC m=+3778.049711548" watchObservedRunningTime="2026-04-16 15:02:16.690124261 +0000 UTC m=+3778.051667029"
Apr 16 15:02:17.676449 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:17.676411 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:02:27.676450 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:27.676403 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:02:37.676674 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:37.676628 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:02:47.676454 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:47.676405 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:02:57.676815 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:02:57.676768 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:03:07.677347 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:07.677299 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:03:17.677189 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:17.677140 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:03:27.678385 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:27.678351 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:03:33.810300 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:33.810257 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"]
Apr 16 15:03:33.810718 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:33.810537 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" containerID="cri-o://81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150" gracePeriod=30
Apr 16 15:03:34.893745 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:34.893710 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"]
Apr 16 15:03:34.896942 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:34.896923 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"
Apr 16 15:03:34.906245 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:34.906220 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"]
Apr 16 15:03:35.032527 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:35.032480 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22875145-8ec3-410a-83ef-9739d328f1c8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd\" (UID: \"22875145-8ec3-410a-83ef-9739d328f1c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"
Apr 16 15:03:35.133886 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:35.133836 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22875145-8ec3-410a-83ef-9739d328f1c8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd\" (UID: \"22875145-8ec3-410a-83ef-9739d328f1c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"
Apr 16 15:03:35.134262 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:35.134237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22875145-8ec3-410a-83ef-9739d328f1c8-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd\" (UID: \"22875145-8ec3-410a-83ef-9739d328f1c8\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"
Apr 16 15:03:35.208886 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:35.208779 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"
Apr 16 15:03:35.340323 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:35.340286 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"]
Apr 16 15:03:35.344387 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:03:35.344353 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22875145_8ec3_410a_83ef_9739d328f1c8.slice/crio-f639825ebcad9267eaa9b1da8df30ee14c6a488532d58aa0bccacf54772bc7bc WatchSource:0}: Error finding container f639825ebcad9267eaa9b1da8df30ee14c6a488532d58aa0bccacf54772bc7bc: Status 404 returned error can't find the container with id f639825ebcad9267eaa9b1da8df30ee14c6a488532d58aa0bccacf54772bc7bc
Apr 16 15:03:35.906443 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:35.906397 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" event={"ID":"22875145-8ec3-410a-83ef-9739d328f1c8","Type":"ContainerStarted","Data":"458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5"}
Apr 16 15:03:35.906443 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:35.906448 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" event={"ID":"22875145-8ec3-410a-83ef-9739d328f1c8","Type":"ContainerStarted","Data":"f639825ebcad9267eaa9b1da8df30ee14c6a488532d58aa0bccacf54772bc7bc"}
Apr 16 15:03:37.677177 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:37.677083 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" probeResult="failure" output="dial tcp 10.133.0.70:8080: connect: connection refused"
Apr 16 15:03:38.352326 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.352302 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:03:38.465139 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.465052 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/875adb15-a189-498c-baf0-4fea06028a24-kserve-provision-location\") pod \"875adb15-a189-498c-baf0-4fea06028a24\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") "
Apr 16 15:03:38.465139 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.465097 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/875adb15-a189-498c-baf0-4fea06028a24-cabundle-cert\") pod \"875adb15-a189-498c-baf0-4fea06028a24\" (UID: \"875adb15-a189-498c-baf0-4fea06028a24\") "
Apr 16 15:03:38.465378 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.465355 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/875adb15-a189-498c-baf0-4fea06028a24-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "875adb15-a189-498c-baf0-4fea06028a24" (UID: "875adb15-a189-498c-baf0-4fea06028a24"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:03:38.465452 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.465433 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/875adb15-a189-498c-baf0-4fea06028a24-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "875adb15-a189-498c-baf0-4fea06028a24" (UID: "875adb15-a189-498c-baf0-4fea06028a24"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 15:03:38.565905 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.565867 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/875adb15-a189-498c-baf0-4fea06028a24-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:03:38.565905 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.565900 2574 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/875adb15-a189-498c-baf0-4fea06028a24-cabundle-cert\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:03:38.916737 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.916704 2574 generic.go:358] "Generic (PLEG): container finished" podID="875adb15-a189-498c-baf0-4fea06028a24" containerID="81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150" exitCode=0
Apr 16 15:03:38.917133 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.916781 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"
Apr 16 15:03:38.917133 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.916788 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" event={"ID":"875adb15-a189-498c-baf0-4fea06028a24","Type":"ContainerDied","Data":"81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150"}
Apr 16 15:03:38.917133 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.916826 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf" event={"ID":"875adb15-a189-498c-baf0-4fea06028a24","Type":"ContainerDied","Data":"0f00703c5c77d1363bf5d05c07ad69c77ac5f9afe5e4b6fe6adfba3c9d75549b"}
Apr 16 15:03:38.917133 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.916846 2574 scope.go:117] "RemoveContainer" containerID="81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150"
Apr 16 15:03:38.925219 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.925198 2574 scope.go:117] "RemoveContainer" containerID="d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728"
Apr 16 15:03:38.932278 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.932261 2574 scope.go:117] "RemoveContainer" containerID="81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150"
Apr 16 15:03:38.932523 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:03:38.932504 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150\": container with ID starting with 81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150 not found: ID does not exist" containerID="81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150"
Apr 16 15:03:38.932591 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.932532 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150"} err="failed to get container status \"81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150\": rpc error: code = NotFound desc = could not find container \"81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150\": container with ID starting with 81aa65f0226e763a74a6bcc8382ab5417429898e7945a6cec907b1ce35ee4150 not found: ID does not exist"
Apr 16 15:03:38.932591 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.932564 2574 scope.go:117] "RemoveContainer" containerID="d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728"
Apr 16 15:03:38.932842 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:03:38.932823 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728\": container with ID starting with d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728 not found: ID does not exist" containerID="d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728"
Apr 16 15:03:38.932903 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.932847 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728"} err="failed to get container status \"d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728\": rpc error: code = NotFound desc = could not find container \"d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728\": container with ID starting with d1497edd8555b48e3014e4e1bba2e0eecfafa69f582ead31af21a20c8eb5d728 not found: ID does not exist"
Apr 16 15:03:38.936891 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.936868 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"]
Apr 16 15:03:38.938620 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:38.938597 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-85c77d64fc-t6gvf"]
Apr 16 15:03:39.187079 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:39.187001 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="875adb15-a189-498c-baf0-4fea06028a24" path="/var/lib/kubelet/pods/875adb15-a189-498c-baf0-4fea06028a24/volumes"
Apr 16 15:03:40.925743 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:40.925662 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd_22875145-8ec3-410a-83ef-9739d328f1c8/storage-initializer/0.log"
Apr 16 15:03:40.925743 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:40.925704 2574 generic.go:358] "Generic (PLEG): container finished" podID="22875145-8ec3-410a-83ef-9739d328f1c8" containerID="458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5" exitCode=1
Apr 16 15:03:40.926263 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:40.925741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" event={"ID":"22875145-8ec3-410a-83ef-9739d328f1c8","Type":"ContainerDied","Data":"458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5"}
Apr 16 15:03:41.930685 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:41.930655 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd_22875145-8ec3-410a-83ef-9739d328f1c8/storage-initializer/0.log"
Apr 16 15:03:41.931065 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:41.930738 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" event={"ID":"22875145-8ec3-410a-83ef-9739d328f1c8","Type":"ContainerStarted","Data":"c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d"}
Apr 16 15:03:44.899698 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:44.899659 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"]
Apr 16 15:03:44.900108 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:44.899894 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" containerName="storage-initializer" containerID="cri-o://c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d" gracePeriod=30
Apr 16 15:03:45.334818 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.334794 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd_22875145-8ec3-410a-83ef-9739d328f1c8/storage-initializer/1.log"
Apr 16 15:03:45.335188 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.335170 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd_22875145-8ec3-410a-83ef-9739d328f1c8/storage-initializer/0.log"
Apr 16 15:03:45.335277 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.335240 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"
Apr 16 15:03:45.423944 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.423903 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22875145-8ec3-410a-83ef-9739d328f1c8-kserve-provision-location\") pod \"22875145-8ec3-410a-83ef-9739d328f1c8\" (UID: \"22875145-8ec3-410a-83ef-9739d328f1c8\") "
Apr 16 15:03:45.424175 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.424152 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22875145-8ec3-410a-83ef-9739d328f1c8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "22875145-8ec3-410a-83ef-9739d328f1c8" (UID: "22875145-8ec3-410a-83ef-9739d328f1c8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:03:45.525253 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.525170 2574 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/22875145-8ec3-410a-83ef-9739d328f1c8-kserve-provision-location\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:03:45.943715 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.943681 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd_22875145-8ec3-410a-83ef-9739d328f1c8/storage-initializer/1.log"
Apr 16 15:03:45.944104 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.944061 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd_22875145-8ec3-410a-83ef-9739d328f1c8/storage-initializer/0.log"
Apr 16 15:03:45.944104 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.944096 2574 generic.go:358] "Generic (PLEG): container finished" podID="22875145-8ec3-410a-83ef-9739d328f1c8" containerID="c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d" exitCode=1
Apr 16 15:03:45.944180 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.944125 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" event={"ID":"22875145-8ec3-410a-83ef-9739d328f1c8","Type":"ContainerDied","Data":"c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d"}
Apr 16 15:03:45.944180 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.944153 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" event={"ID":"22875145-8ec3-410a-83ef-9739d328f1c8","Type":"ContainerDied","Data":"f639825ebcad9267eaa9b1da8df30ee14c6a488532d58aa0bccacf54772bc7bc"}
Apr 16 15:03:45.944180 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.944173 2574 scope.go:117] "RemoveContainer" containerID="c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d"
Apr 16 15:03:45.944284 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.944174 2574 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd" Apr 16 15:03:45.951916 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.951896 2574 scope.go:117] "RemoveContainer" containerID="458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5" Apr 16 15:03:45.958943 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.958924 2574 scope.go:117] "RemoveContainer" containerID="c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d" Apr 16 15:03:45.959168 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:03:45.959148 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d\": container with ID starting with c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d not found: ID does not exist" containerID="c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d" Apr 16 15:03:45.959214 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.959177 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d"} err="failed to get container status \"c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d\": rpc error: code = NotFound desc = could not find container \"c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d\": container with ID starting with c621b5fe373c1ae26ae4f1dac07db8d3fbea5800585e372e02047e2d9852f34d not found: ID does not exist" Apr 16 15:03:45.959214 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.959192 2574 scope.go:117] "RemoveContainer" containerID="458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5" Apr 16 15:03:45.959399 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:03:45.959381 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5\": container with ID starting with 458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5 not found: ID does not exist" containerID="458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5" Apr 16 15:03:45.959440 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.959408 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5"} err="failed to get container status \"458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5\": rpc error: code = NotFound desc = could not find container \"458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5\": container with ID starting with 458f8c6acb823dd4bc04bd8cfcf62e1366b7222725c97d30401fe43df82d1dc5 not found: ID does not exist" Apr 16 15:03:45.977508 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.977471 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"] Apr 16 15:03:45.980398 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:45.980369 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-698cb7f679-dccpd"] Apr 16 15:03:47.187853 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.187823 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" 
path="/var/lib/kubelet/pods/22875145-8ec3-410a-83ef-9739d328f1c8/volumes" Apr 16 15:03:47.243518 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243489 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-m7mbk/must-gather-xxw27"] Apr 16 15:03:47.243828 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243814 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" Apr 16 15:03:47.243874 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243830 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" Apr 16 15:03:47.243874 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243855 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="storage-initializer" Apr 16 15:03:47.243874 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243863 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="storage-initializer" Apr 16 15:03:47.243986 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243874 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" containerName="storage-initializer" Apr 16 15:03:47.243986 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243882 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" containerName="storage-initializer" Apr 16 15:03:47.243986 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243953 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="875adb15-a189-498c-baf0-4fea06028a24" containerName="kserve-container" Apr 16 15:03:47.243986 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.243966 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" containerName="storage-initializer" Apr 16 15:03:47.244132 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.244023 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" containerName="storage-initializer" Apr 16 15:03:47.244132 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.244029 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" containerName="storage-initializer" Apr 16 15:03:47.244132 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.244078 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="22875145-8ec3-410a-83ef-9739d328f1c8" containerName="storage-initializer" Apr 16 15:03:47.248138 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.248117 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.252030 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.252009 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m7mbk\"/\"kube-root-ca.crt\"" Apr 16 15:03:47.252030 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.252030 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-m7mbk\"/\"openshift-service-ca.crt\"" Apr 16 15:03:47.270233 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.270206 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m7mbk/must-gather-xxw27"] Apr 16 15:03:47.340199 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.340159 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctkcn\" (UniqueName: \"kubernetes.io/projected/691afa6d-735e-4a62-bd4a-4fd34400b643-kube-api-access-ctkcn\") pod \"must-gather-xxw27\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") " pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.340391 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.340220 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/691afa6d-735e-4a62-bd4a-4fd34400b643-must-gather-output\") pod \"must-gather-xxw27\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") " pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.440742 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.440644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/691afa6d-735e-4a62-bd4a-4fd34400b643-must-gather-output\") pod \"must-gather-xxw27\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") " pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.440742 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.440727 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctkcn\" (UniqueName: \"kubernetes.io/projected/691afa6d-735e-4a62-bd4a-4fd34400b643-kube-api-access-ctkcn\") pod \"must-gather-xxw27\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") " pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.441007 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.440986 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/691afa6d-735e-4a62-bd4a-4fd34400b643-must-gather-output\") pod \"must-gather-xxw27\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") " pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.456108 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.456079 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctkcn\" (UniqueName: \"kubernetes.io/projected/691afa6d-735e-4a62-bd4a-4fd34400b643-kube-api-access-ctkcn\") pod \"must-gather-xxw27\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") " pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.571877 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.571840 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-m7mbk/must-gather-xxw27" Apr 16 15:03:47.703345 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.703267 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-m7mbk/must-gather-xxw27"] Apr 16 15:03:47.706810 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:03:47.706778 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod691afa6d_735e_4a62_bd4a_4fd34400b643.slice/crio-120941a63c5af8331e2651783fd3101bc6462bd3423594ce3c7485221c7e10ff WatchSource:0}: Error finding container 120941a63c5af8331e2651783fd3101bc6462bd3423594ce3c7485221c7e10ff: Status 404 returned error can't find the container with id 120941a63c5af8331e2651783fd3101bc6462bd3423594ce3c7485221c7e10ff Apr 16 15:03:47.951056 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:47.951015 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7mbk/must-gather-xxw27" event={"ID":"691afa6d-735e-4a62-bd4a-4fd34400b643","Type":"ContainerStarted","Data":"120941a63c5af8331e2651783fd3101bc6462bd3423594ce3c7485221c7e10ff"} Apr 16 15:03:52.968400 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:52.968356 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7mbk/must-gather-xxw27" event={"ID":"691afa6d-735e-4a62-bd4a-4fd34400b643","Type":"ContainerStarted","Data":"3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36"} Apr 16 15:03:52.968400 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:52.968408 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7mbk/must-gather-xxw27" event={"ID":"691afa6d-735e-4a62-bd4a-4fd34400b643","Type":"ContainerStarted","Data":"bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2"} Apr 16 15:03:52.983718 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:03:52.983666 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-m7mbk/must-gather-xxw27" podStartSLOduration=1.750830517 podStartE2EDuration="5.9836472s" podCreationTimestamp="2026-04-16 15:03:47 +0000 UTC" firstStartedPulling="2026-04-16 15:03:47.708826196 +0000 UTC m=+3869.070368942" lastFinishedPulling="2026-04-16 15:03:51.941642869 +0000 UTC m=+3873.303185625" observedRunningTime="2026-04-16 15:03:52.982373742 +0000 UTC m=+3874.343916510" watchObservedRunningTime="2026-04-16 15:03:52.9836472 +0000 UTC m=+3874.345189993" Apr 16 15:04:13.039414 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:13.039370 2574 generic.go:358] "Generic (PLEG): container finished" podID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerID="bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2" exitCode=0 Apr 16 15:04:13.039927 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:13.039439 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-m7mbk/must-gather-xxw27" event={"ID":"691afa6d-735e-4a62-bd4a-4fd34400b643","Type":"ContainerDied","Data":"bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2"} Apr 16 15:04:13.039927 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:13.039803 2574 scope.go:117] "RemoveContainer" containerID="bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2" Apr 16 15:04:13.995190 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:13.995159 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m7mbk_must-gather-xxw27_691afa6d-735e-4a62-bd4a-4fd34400b643/gather/0.log" 
Apr 16 15:04:14.680007 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.679974 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g9h4j/must-gather-jwdr2"]
Apr 16 15:04:14.682092 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.682075 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:14.684235 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.684216 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g9h4j\"/\"kube-root-ca.crt\""
Apr 16 15:04:14.684490 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.684470 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-g9h4j\"/\"openshift-service-ca.crt\""
Apr 16 15:04:14.684663 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.684645 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-g9h4j\"/\"default-dockercfg-df925\""
Apr 16 15:04:14.694160 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.694133 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/must-gather-jwdr2"]
Apr 16 15:04:14.769761 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.769725 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3343786-b369-445c-81a0-0033286db398-must-gather-output\") pod \"must-gather-jwdr2\" (UID: \"c3343786-b369-445c-81a0-0033286db398\") " pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:14.769953 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.769790 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkztw\" (UniqueName: \"kubernetes.io/projected/c3343786-b369-445c-81a0-0033286db398-kube-api-access-tkztw\") pod \"must-gather-jwdr2\" (UID: \"c3343786-b369-445c-81a0-0033286db398\") " pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:14.870610 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.870548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3343786-b369-445c-81a0-0033286db398-must-gather-output\") pod \"must-gather-jwdr2\" (UID: \"c3343786-b369-445c-81a0-0033286db398\") " pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:14.870788 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.870647 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkztw\" (UniqueName: \"kubernetes.io/projected/c3343786-b369-445c-81a0-0033286db398-kube-api-access-tkztw\") pod \"must-gather-jwdr2\" (UID: \"c3343786-b369-445c-81a0-0033286db398\") " pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:14.870909 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.870889 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/c3343786-b369-445c-81a0-0033286db398-must-gather-output\") pod \"must-gather-jwdr2\" (UID: \"c3343786-b369-445c-81a0-0033286db398\") " pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:14.884639 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.884605 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkztw\" (UniqueName: \"kubernetes.io/projected/c3343786-b369-445c-81a0-0033286db398-kube-api-access-tkztw\") pod \"must-gather-jwdr2\" (UID: \"c3343786-b369-445c-81a0-0033286db398\") " pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:14.991528 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:14.991422 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g9h4j/must-gather-jwdr2"
Apr 16 15:04:15.113506 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:15.113482 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/must-gather-jwdr2"]
Apr 16 15:04:15.116102 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:04:15.116072 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3343786_b369_445c_81a0_0033286db398.slice/crio-57b030cb31a2be072e93a6f007e3925b6ee6445a5af7ff2a1a417dbc9e22acf9 WatchSource:0}: Error finding container 57b030cb31a2be072e93a6f007e3925b6ee6445a5af7ff2a1a417dbc9e22acf9: Status 404 returned error can't find the container with id 57b030cb31a2be072e93a6f007e3925b6ee6445a5af7ff2a1a417dbc9e22acf9
Apr 16 15:04:16.050974 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:16.050932 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/must-gather-jwdr2" event={"ID":"c3343786-b369-445c-81a0-0033286db398","Type":"ContainerStarted","Data":"57b030cb31a2be072e93a6f007e3925b6ee6445a5af7ff2a1a417dbc9e22acf9"}
Apr 16 15:04:17.055972 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:17.055920 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/must-gather-jwdr2" event={"ID":"c3343786-b369-445c-81a0-0033286db398","Type":"ContainerStarted","Data":"6d3edc2a706c18d5b0b485d82f1cf6677afbdf89380a834b4cce52d1316914ae"}
Apr 16 15:04:17.056439 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:17.055981 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/must-gather-jwdr2" event={"ID":"c3343786-b369-445c-81a0-0033286db398","Type":"ContainerStarted","Data":"9cae52eea85e1886d21030469c16ea9647051d18171c92d690c8d0a5a0c3d960"}
Apr 16 15:04:17.079767 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:17.079706 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g9h4j/must-gather-jwdr2" podStartSLOduration=2.183538276 podStartE2EDuration="3.079685952s" podCreationTimestamp="2026-04-16 15:04:14 +0000 UTC" firstStartedPulling="2026-04-16 15:04:15.117944372 +0000 UTC m=+3896.479487117" lastFinishedPulling="2026-04-16 15:04:16.014092043 +0000 UTC m=+3897.375634793" observedRunningTime="2026-04-16 15:04:17.077866758 +0000 UTC m=+3898.439409529" watchObservedRunningTime="2026-04-16 15:04:17.079685952 +0000 UTC m=+3898.441228720"
Apr 16 15:04:17.723627 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:17.723565 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-snqxc_2c1c9f52-2fd5-4d89-83a2-c7f03121a1f9/global-pull-secret-syncer/0.log"
Apr 16 15:04:17.851195 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:17.851162 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-l76kn_0f53d27c-7378-4e8b-8dfd-f39beb70f859/konnectivity-agent/0.log"
Apr 16 15:04:17.897782 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:17.897754 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-139-151.ec2.internal_c5729c9ff098ff2004acf3ccde20c30f/haproxy/0.log"
Apr 16 15:04:19.510404 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:19.510361 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-m7mbk/must-gather-xxw27"]
Apr 16 15:04:19.510955 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:19.510694 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-m7mbk/must-gather-xxw27" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerName="copy" containerID="cri-o://3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36" gracePeriod=2
Apr 16 15:04:19.513422 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:19.513395 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-m7mbk/must-gather-xxw27"]
Apr 16 15:04:19.513670 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:19.513643 2574 status_manager.go:895] "Failed to get status for pod" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" pod="openshift-must-gather-m7mbk/must-gather-xxw27" err="pods \"must-gather-xxw27\" is forbidden: User \"system:node:ip-10-0-139-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7mbk\": no relationship found between node 'ip-10-0-139-151.ec2.internal' and this object"
Apr 16 15:04:19.888669 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:19.882917 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m7mbk_must-gather-xxw27_691afa6d-735e-4a62-bd4a-4fd34400b643/copy/0.log"
Apr 16 15:04:19.888669 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:19.888014 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m7mbk/must-gather-xxw27"
Apr 16 15:04:19.894627 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:19.890956 2574 status_manager.go:895] "Failed to get status for pod" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" pod="openshift-must-gather-m7mbk/must-gather-xxw27" err="pods \"must-gather-xxw27\" is forbidden: User \"system:node:ip-10-0-139-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7mbk\": no relationship found between node 'ip-10-0-139-151.ec2.internal' and this object"
Apr 16 15:04:20.030719 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.030678 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/691afa6d-735e-4a62-bd4a-4fd34400b643-must-gather-output\") pod \"691afa6d-735e-4a62-bd4a-4fd34400b643\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") "
Apr 16 15:04:20.031050 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.031030 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctkcn\" (UniqueName: \"kubernetes.io/projected/691afa6d-735e-4a62-bd4a-4fd34400b643-kube-api-access-ctkcn\") pod \"691afa6d-735e-4a62-bd4a-4fd34400b643\" (UID: \"691afa6d-735e-4a62-bd4a-4fd34400b643\") "
Apr 16 15:04:20.032642 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.032598 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/691afa6d-735e-4a62-bd4a-4fd34400b643-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "691afa6d-735e-4a62-bd4a-4fd34400b643" (UID: "691afa6d-735e-4a62-bd4a-4fd34400b643"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 15:04:20.034012 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.033973 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/691afa6d-735e-4a62-bd4a-4fd34400b643-kube-api-access-ctkcn" (OuterVolumeSpecName: "kube-api-access-ctkcn") pod "691afa6d-735e-4a62-bd4a-4fd34400b643" (UID: "691afa6d-735e-4a62-bd4a-4fd34400b643"). InnerVolumeSpecName "kube-api-access-ctkcn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 15:04:20.070865 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.070817 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-m7mbk_must-gather-xxw27_691afa6d-735e-4a62-bd4a-4fd34400b643/copy/0.log"
Apr 16 15:04:20.071458 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.071431 2574 generic.go:358] "Generic (PLEG): container finished" podID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerID="3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36" exitCode=143
Apr 16 15:04:20.071601 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.071520 2574 scope.go:117] "RemoveContainer" containerID="3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36"
Apr 16 15:04:20.071731 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.071715 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-m7mbk/must-gather-xxw27"
Apr 16 15:04:20.073903 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.073871 2574 status_manager.go:895] "Failed to get status for pod" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" pod="openshift-must-gather-m7mbk/must-gather-xxw27" err="pods \"must-gather-xxw27\" is forbidden: User \"system:node:ip-10-0-139-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7mbk\": no relationship found between node 'ip-10-0-139-151.ec2.internal' and this object"
Apr 16 15:04:20.090010 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.089969 2574 status_manager.go:895] "Failed to get status for pod" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" pod="openshift-must-gather-m7mbk/must-gather-xxw27" err="pods \"must-gather-xxw27\" is forbidden: User \"system:node:ip-10-0-139-151.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-m7mbk\": no relationship found between node 'ip-10-0-139-151.ec2.internal' and this object"
Apr 16 15:04:20.090546 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.090398 2574 scope.go:117] "RemoveContainer" containerID="bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2"
Apr 16 15:04:20.115128 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.115098 2574 scope.go:117] "RemoveContainer" containerID="3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36"
Apr 16 15:04:20.116251 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:04:20.115927 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36\": container with ID starting with 3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36 not found: ID does not exist" containerID="3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36"
Apr 16 15:04:20.116251 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.116007 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36"} err="failed to get container status \"3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36\": rpc error: code = NotFound desc = could not find container \"3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36\": container with ID starting with 3bf77d5abdf110c5de12af21815bfa8c782e06866ebfd6a93184f55e8e1c5d36 not found: ID does not exist"
Apr 16 15:04:20.116251 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.116037 2574 scope.go:117] "RemoveContainer" containerID="bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2"
Apr 16 15:04:20.116976 ip-10-0-139-151 kubenswrapper[2574]: E0416 15:04:20.116885 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2\": container with ID starting with bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2 not found: ID does not exist" containerID="bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2"
Apr 16 15:04:20.116976 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.116916 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2"} err="failed to get container status \"bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2\": rpc error: code = NotFound desc = could not find container \"bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2\": container with ID starting with bde795c1a654b9480eaa89f718cdf83a159fc8181837dbdc1d81f24464d7a0b2 not found: ID does not exist"
Apr 16 15:04:20.132793 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.132726 2574 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/691afa6d-735e-4a62-bd4a-4fd34400b643-must-gather-output\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:04:20.132793 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:20.132764 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ctkcn\" (UniqueName: \"kubernetes.io/projected/691afa6d-735e-4a62-bd4a-4fd34400b643-kube-api-access-ctkcn\") on node \"ip-10-0-139-151.ec2.internal\" DevicePath \"\""
Apr 16 15:04:21.189252 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:21.189211 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" path="/var/lib/kubelet/pods/691afa6d-735e-4a62-bd4a-4fd34400b643/volumes"
Apr 16 15:04:22.131883 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.131850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gwdbk_bbcf63e7-182a-4aaf-a012-585f17a5a74a/node-exporter/0.log"
Apr 16 15:04:22.153380 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.153348 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gwdbk_bbcf63e7-182a-4aaf-a012-585f17a5a74a/kube-rbac-proxy/0.log"
Apr 16 15:04:22.174515 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.174487 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gwdbk_bbcf63e7-182a-4aaf-a012-585f17a5a74a/init-textfile/0.log"
Apr 16 15:04:22.306926 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.306869 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fbe94934-f8e3-473a-afe1-8ecf9efd63b3/prometheus/0.log"
Apr 16 15:04:22.325972 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.325940 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fbe94934-f8e3-473a-afe1-8ecf9efd63b3/config-reloader/0.log"
Apr 16 15:04:22.348945 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.348874 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fbe94934-f8e3-473a-afe1-8ecf9efd63b3/thanos-sidecar/0.log"
Apr 16 15:04:22.369798 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.369768 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fbe94934-f8e3-473a-afe1-8ecf9efd63b3/kube-rbac-proxy-web/0.log"
Apr 16 15:04:22.393462 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.393396 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fbe94934-f8e3-473a-afe1-8ecf9efd63b3/kube-rbac-proxy/0.log"
Apr 16 15:04:22.414398 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.414364 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fbe94934-f8e3-473a-afe1-8ecf9efd63b3/kube-rbac-proxy-thanos/0.log"
Apr 16 15:04:22.437054 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.437019 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_fbe94934-f8e3-473a-afe1-8ecf9efd63b3/init-config-reloader/0.log"
Apr 16 15:04:22.467624 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.467572 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-4bf7x_d84ae69b-8b20-4dc2-bb8b-46889ca16d3e/prometheus-operator/0.log"
Apr 16 15:04:22.498316 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:22.498243 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-4bf7x_d84ae69b-8b20-4dc2-bb8b-46889ca16d3e/kube-rbac-proxy/0.log"
Apr 16 15:04:25.100559 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.100522 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-vjw2w_2310273c-d482-4d05-b71f-6db31c8a2fe1/volume-data-source-validator/0.log"
Apr 16 15:04:25.105660 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.105628 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"]
Apr 16 15:04:25.106047 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.106030 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerName="gather"
Apr 16 15:04:25.106116 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.106050 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerName="gather"
Apr 16 15:04:25.106116 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.106059 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerName="copy"
Apr 16 15:04:25.106116 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.106067 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerName="copy"
Apr 16 15:04:25.106229 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.106140 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerName="gather"
Apr 16 15:04:25.106229 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.106152 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="691afa6d-735e-4a62-bd4a-4fd34400b643" containerName="copy"
Apr 16 15:04:25.109883 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.109863 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.117370 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.117338 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"]
Apr 16 15:04:25.178632 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.178519 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-sys\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.178838 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.178651 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-lib-modules\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.178838 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.178720 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-podres\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.178838 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.178789 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-proc\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.179033 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.178862 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4dg4\" (UniqueName: \"kubernetes.io/projected/88ba954e-65e9-4381-b0b6-9068c97db984-kube-api-access-f4dg4\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280191 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280155 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-sys\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280191 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-lib-modules\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280457 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280248 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-podres\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280457 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280286 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-sys\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280457 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-lib-modules\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280457 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280335 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-proc\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280457 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280390 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4dg4\" (UniqueName: \"kubernetes.io/projected/88ba954e-65e9-4381-b0b6-9068c97db984-kube-api-access-f4dg4\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280457 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280408 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-proc\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.280457 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.280388 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/88ba954e-65e9-4381-b0b6-9068c97db984-podres\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.288836 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.288800 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4dg4\" (UniqueName: \"kubernetes.io/projected/88ba954e-65e9-4381-b0b6-9068c97db984-kube-api-access-f4dg4\") pod \"perf-node-gather-daemonset-57n54\" (UID: \"88ba954e-65e9-4381-b0b6-9068c97db984\") " pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.422191 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.422086 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:25.568206 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.568174 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"]
Apr 16 15:04:25.571528 ip-10-0-139-151 kubenswrapper[2574]: W0416 15:04:25.571497 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod88ba954e_65e9_4381_b0b6_9068c97db984.slice/crio-90a94323640752e93af384bc6df7e557b54d44b3b2a1561e37a37826efb74a13 WatchSource:0}: Error finding container 90a94323640752e93af384bc6df7e557b54d44b3b2a1561e37a37826efb74a13: Status 404 returned error can't find the container with id 90a94323640752e93af384bc6df7e557b54d44b3b2a1561e37a37826efb74a13
Apr 16 15:04:25.573049 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.573033 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 15:04:25.780911 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.780813 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-84w8w_defde0b7-6e87-43ef-ad17-0ce3ffc5d902/dns/0.log"
Apr 16 15:04:25.800710 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.800682 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-84w8w_defde0b7-6e87-43ef-ad17-0ce3ffc5d902/kube-rbac-proxy/0.log"
Apr 16 15:04:25.911604 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:25.911551 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-65gv2_aa030d80-2a63-4669-acb7-9485b1b8ce4a/dns-node-resolver/0.log"
Apr 16 15:04:26.092528 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:26.092491 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54" event={"ID":"88ba954e-65e9-4381-b0b6-9068c97db984","Type":"ContainerStarted","Data":"6159495cd607ebb843b5d36645eeb521bda8806c1451a58223ab2db8b941b0d9"}
Apr 16 15:04:26.092528 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:26.092529 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54" event={"ID":"88ba954e-65e9-4381-b0b6-9068c97db984","Type":"ContainerStarted","Data":"90a94323640752e93af384bc6df7e557b54d44b3b2a1561e37a37826efb74a13"}
Apr 16 15:04:26.092783 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:26.092712 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:26.106793 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:26.106741 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54" podStartSLOduration=1.106725817 podStartE2EDuration="1.106725817s" podCreationTimestamp="2026-04-16 15:04:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 15:04:26.106178653 +0000 UTC m=+3907.467721433" watchObservedRunningTime="2026-04-16 15:04:26.106725817 +0000 UTC m=+3907.468268585"
Apr 16 15:04:26.485510 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:26.485385 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xzrpt_567139c7-8d34-429b-bd38-0ab1aafa14e9/node-ca/0.log"
Apr 16 15:04:27.604324 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:27.604298 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-2bcnf_e05dc6e0-3aae-437d-a7bd-6b5851441185/serve-healthcheck-canary/0.log"
Apr 16 15:04:28.070793 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:28.070767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xpgj4_7c051fe0-3220-4517-8c4c-4c0a8bf7518d/insights-operator/0.log"
Apr 16 15:04:28.072464 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:28.072442 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xpgj4_7c051fe0-3220-4517-8c4c-4c0a8bf7518d/insights-operator/1.log"
Apr 16 15:04:28.093311 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:28.093284 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2th59_27a36c64-4ff1-437f-8dc4-ca6ff5387bfd/kube-rbac-proxy/0.log"
Apr 16 15:04:28.116833 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:28.116803 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2th59_27a36c64-4ff1-437f-8dc4-ca6ff5387bfd/exporter/0.log"
Apr 16 15:04:28.156403 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:28.156355 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-2th59_27a36c64-4ff1-437f-8dc4-ca6ff5387bfd/extractor/0.log"
Apr 16 15:04:30.465037 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.465005 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-crbmt_fa946f5a-8221-4ddb-818c-316b3ef0afa2/server/0.log"
Apr 16 15:04:30.712882 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.712839 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-md2fm_86603922-7ef5-4db6-bca9-3a12f847824c/manager/0.log"
Apr 16 15:04:30.733622 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.733508 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-62m6k_bcfb22cf-a086-4275-a00d-092421ff7fc7/s3-init/0.log"
Apr 16 15:04:30.757246 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.757219 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-2qngm_5e0fc195-7fd2-4efd-be4a-4603e88e634d/s3-tls-init-custom/0.log"
Apr 16 15:04:30.780299 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.780267 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-799wj_6eee4fd4-65cd-4ea6-a004-5bc8ff84ba06/s3-tls-init-serving/0.log"
Apr 16 15:04:30.814149 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.814119 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-nmgjf_2fc61e58-081f-401b-85c4-70fdec24c552/seaweedfs/0.log"
Apr 16 15:04:30.841898 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.841871 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-custom-5c88b85bb7-8vc7n_37a7a911-e94b-438e-ad0f-8c3a2988996d/seaweedfs-tls-custom/0.log"
Apr 16 15:04:30.863647 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:30.863620 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-tls-serving-7fd5766db9-w9jmf_09ee73fc-a005-4eb2-833c-86a8deb2d48d/seaweedfs-tls-serving/0.log"
Apr 16 15:04:32.106231 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:32.106200 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-g9h4j/perf-node-gather-daemonset-57n54"
Apr 16 15:04:35.087861 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:35.087824 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-jm8nq_956aabe3-df2b-45a4-bf9b-66468aff27cd/migrator/0.log"
Apr 16 15:04:35.113052 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:35.112903 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-jm8nq_956aabe3-df2b-45a4-bf9b-66468aff27cd/graceful-termination/0.log"
Apr 16 15:04:36.468230 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.468193 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-44586_edb2f95f-5ada-45b4-862b-187eab79d4a5/kube-multus/0.log"
Apr 16 15:04:36.491188 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.491161 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c2sdt_5596cb4f-7692-4c74-82c7-87e46bdfd720/kube-multus-additional-cni-plugins/0.log"
Apr 16 15:04:36.514503 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.514476 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c2sdt_5596cb4f-7692-4c74-82c7-87e46bdfd720/egress-router-binary-copy/0.log"
Apr 16 15:04:36.536546 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.536514 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c2sdt_5596cb4f-7692-4c74-82c7-87e46bdfd720/cni-plugins/0.log"
Apr 16 15:04:36.559146 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.559111 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c2sdt_5596cb4f-7692-4c74-82c7-87e46bdfd720/bond-cni-plugin/0.log"
Apr 16 15:04:36.582621 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.582595 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c2sdt_5596cb4f-7692-4c74-82c7-87e46bdfd720/routeoverride-cni/0.log"
Apr 16 15:04:36.604721 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.604696 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c2sdt_5596cb4f-7692-4c74-82c7-87e46bdfd720/whereabouts-cni-bincopy/0.log"
Apr 16 15:04:36.626747 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:36.626715 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-c2sdt_5596cb4f-7692-4c74-82c7-87e46bdfd720/whereabouts-cni/0.log"
Apr 16 15:04:37.069747 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:37.069673 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fbnhb_3d0ab572-848b-495c-afdf-ad744ea2b230/network-metrics-daemon/0.log"
Apr 16 15:04:37.091037 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:37.091011 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fbnhb_3d0ab572-848b-495c-afdf-ad744ea2b230/kube-rbac-proxy/0.log"
Apr 16 15:04:37.943744 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:37.943710 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-controller/0.log"
Apr 16 15:04:37.960244 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:37.960218 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"
Apr 16 15:04:37.982006 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:37.981972 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/1.log"
Apr 16 15:04:38.000562 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:38.000534 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/kube-rbac-proxy-node/0.log"
Apr 16 15:04:38.020760 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:38.020736 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 15:04:38.040834 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:38.040803 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/northd/0.log"
Apr 16 15:04:38.061070 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:38.061041 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/nbdb/0.log"
Apr 16 15:04:38.084395 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:38.084370 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/sbdb/0.log"
Apr 16 15:04:38.209997 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:38.209910 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovnkube-controller/0.log"
Apr 16 15:04:39.943441 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:39.943412 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-vl5lw_5b2a2f05-9a25-4652-ba72-816977b324b5/network-check-target-container/0.log"
Apr 16 15:04:40.902611 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:40.902557 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-zzzpm_2febdb01-c922-4ac4-81e6-2b92df909f85/iptables-alerter/0.log"
Apr 16 15:04:41.632230 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:41.632203 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-sfmlv_c0077ff0-36ff-4fe1-bc19-c63239f74a39/tuned/0.log"
Apr 16 15:04:42.337995 ip-10-0-139-151 kubenswrapper[2574]: I0416 15:04:42.337967 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-l6b5l_467df738-bd26-4dba-b771-01c7f6844b70/ovn-acl-logging/0.log"