Apr 16 18:17:03.780256 ip-10-0-136-226 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 18:17:04.167841 ip-10-0-136-226 kubenswrapper[2566]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:04.167841 ip-10-0-136-226 kubenswrapper[2566]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 18:17:04.167841 ip-10-0-136-226 kubenswrapper[2566]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:04.167841 ip-10-0-136-226 kubenswrapper[2566]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 18:17:04.167841 ip-10-0-136-226 kubenswrapper[2566]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 18:17:04.170455 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.170330 2566 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 18:17:04.175135 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175121 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:04.175135 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175135 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175139 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175143 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175146 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175150 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175152 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175155 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175159 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175161 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175166 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175169 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175173 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175176 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175180 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175184 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175186 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175189 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175192 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175194 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:04.175199 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175197 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175199 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175202 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175205 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175208 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175211 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175214 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175217 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175219 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175222 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175224 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175227 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175229 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175232 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175235 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175237 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175240 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175242 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175245 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175247 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:04.175635 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175250 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175252 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175255 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175257 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175261 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175265 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175268 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175271 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175273 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175276 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175279 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175282 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175285 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175288 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175292 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175294 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175297 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175299 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175302 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:04.176147 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175305 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175308 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175310 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175313 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175315 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175318 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175321 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175324 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175327 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175329 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175332 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175334 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175337 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175339 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175342 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175346 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175348 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175351 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175353 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175356 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:04.176605 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175359 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175362 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175364 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175367 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175370 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175373 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175377 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175777 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175783 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175787 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175790 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175793 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175795 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175798 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175801 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175803 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175806 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175808 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175811 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175814 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:04.177108 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175816 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175819 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175822 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175824 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175827 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175829 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175832 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175835 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175837 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175840 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175842 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175846 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175849 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175855 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175858 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175860 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175863 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175867 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175872 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:04.177584 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175875 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175878 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175881 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175884 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175887 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175890 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175893 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175896 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175898 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175901 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175904 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175907 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175909 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175912 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175915 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175918 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175920 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175923 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175927 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:04.178071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175931 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175934 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175937 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175939 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175942 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175944 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175947 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175949 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175952 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175955 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175957 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175961 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175963 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175966 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175969 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175971 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175974 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175976 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175979 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175981 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:04.178538 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175984 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.175986 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176002 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176005 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176007 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176011 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176013 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176016 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176019 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176022 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176024 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176027 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176030 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176032 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.176035 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177190 2566 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177199 2566 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177205 2566 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177210 2566 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177214 2566 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177217 2566 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 18:17:04.179053 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177223 2566 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177227 2566 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177231 2566 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177234 2566 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177238 2566 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177241 2566 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177244 2566 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177247 2566 flags.go:64] FLAG: --cgroup-root=""
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177250 2566 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177253 2566 flags.go:64] FLAG: --client-ca-file=""
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177256 2566 flags.go:64] FLAG: --cloud-config=""
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177259 2566 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177262 2566 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177267 2566 flags.go:64] FLAG: --cluster-domain=""
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177270 2566 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177273 2566 flags.go:64] FLAG: --config-dir=""
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177275 2566 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177279 2566 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177283 2566 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177287 2566 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177290 2566 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177294 2566 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177297 2566 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177300 2566 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 18:17:04.179569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177303 2566 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177306 2566 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177309 2566 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177314 2566 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177317 2566 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177320 2566 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177323 2566 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177326 2566 flags.go:64] FLAG: --enable-server="true"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177329 2566 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177334 2566 flags.go:64] FLAG: --event-burst="100"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177337 2566 flags.go:64] FLAG: --event-qps="50"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177341 2566 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177344 2566 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177347 2566 flags.go:64] FLAG: --eviction-hard=""
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177351 2566 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177354 2566 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177357 2566 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177360 2566 flags.go:64] FLAG: --eviction-soft=""
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177363 2566 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177366 2566 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177369 2566 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177371 2566 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177374 2566 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177377 2566 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177380 2566 flags.go:64] FLAG: --feature-gates=""
Apr 16 18:17:04.180161 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177384 2566 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177389 2566 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177393 2566 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177396 2566 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177399 2566 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177402 2566 flags.go:64] FLAG: --help="false"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177405 2566 flags.go:64] FLAG: --hostname-override="ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177409 2566 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177412 2566 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177415 2566 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177418 2566 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177421 2566 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177424 2566 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177427 2566 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177430 2566 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177433 2566 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177436 2566 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177440 2566 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177444 2566 flags.go:64] FLAG: --kube-reserved=""
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177447 2566 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177450 2566 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177453 2566 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177455 2566 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177458 2566 flags.go:64] FLAG: --lock-file=""
Apr 16 18:17:04.180842 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177461 2566 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177464 2566 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177467 2566 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177472 2566 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177475 2566 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177478 2566 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177481 2566 flags.go:64] FLAG: --logging-format="text"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177484 2566 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177487 2566 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177495 2566 flags.go:64] FLAG: --manifest-url=""
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177498 2566 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177503 2566 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177506 2566 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177510 2566 flags.go:64] FLAG: --max-pods="110"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177513 2566 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177516 2566 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177519 2566 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177522 2566 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177525 2566 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177527 2566 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177530 2566 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177538 2566 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177541 2566 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177544 2566 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 18:17:04.181443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177547 2566 flags.go:64] FLAG: --pod-cidr=""
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177550 2566 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177556 2566 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177559 2566 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177563 2566 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177566 2566 flags.go:64] FLAG: --port="10250"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177569 2566 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177571 2566 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ad304b8ceb5ea22a"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177575 2566 flags.go:64] FLAG: --qos-reserved=""
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177578 2566 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177580 2566 flags.go:64] FLAG: --register-node="true"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177583 2566 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177586 2566 flags.go:64] FLAG: --register-with-taints=""
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177590 2566 flags.go:64] FLAG: --registry-burst="10"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177593 2566 flags.go:64] FLAG: --registry-qps="5"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177596 2566 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177599 2566 flags.go:64] FLAG: --reserved-memory=""
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177604 2566 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177608 2566 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177611 2566 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177614 2566 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177617 2566 flags.go:64] FLAG: --runonce="false"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177620 2566 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177623 2566 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177626 2566 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 18:17:04.182051 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177629 2566 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177631 2566 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177634 2566 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177637 2566 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177640 2566 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177643 2566 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177646 2566 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177649 2566 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177652 2566 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177655 2566 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177659 2566 flags.go:64] FLAG: --system-cgroups=""
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177662 2566 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177667 2566 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177670 2566 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.177673 2566 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178107 2566 flags.go:64] FLAG: --tls-min-version=""
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178111 2566 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178114 2566 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178117 2566 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178120 2566 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178123 2566 flags.go:64] FLAG: --v="2"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178128 2566 flags.go:64] FLAG: --version="false"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178132 2566 flags.go:64] FLAG: --vmodule=""
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178137 2566 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178142 2566 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 18:17:04.182641 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178239 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178242 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178246 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178248 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178251 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178254 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178257 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178260 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178262 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178265 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178267 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178270 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178272 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178275 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178277 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178280 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178282 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178285 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178288 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178290 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:04.183256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178293 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178296 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178298 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178301 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178304 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178307 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178311 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178315 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178318 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178321 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178324 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178327 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178333 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178336 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178338 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178341 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178344 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178346 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178348 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:04.183758 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178351 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178354 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178356 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178359 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178361 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178364 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178366 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178368 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178371 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178375 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178379 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178381 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178385 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178387 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178390 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178393 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178395 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178398 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178400 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178403 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:04.184278 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178406 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178408 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178411 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178414 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178416 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178420 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178423 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178426 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178428 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178431 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178434 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178436 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178439 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178441 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178444 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178447 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178449 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178452 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178454 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178457 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:04.185071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178459 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:04.185663 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178462 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:04.185663 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178464 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:04.185663 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178467 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:04.185663 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178469 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:04.185663 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178472 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:04.185663 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.178475 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:04.185663 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.178492 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:04.186691 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.186672 2566 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 18:17:04.186726 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.186693 2566 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 18:17:04.186753 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186744 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:04.186753 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186749 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:04.186753 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186753 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186757 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186761 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186764 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186767 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186770 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186773 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186776 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186779 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186781 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186784 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186787 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186790 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186792 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186795 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186798 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186800 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186803 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186806 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186809 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:04.186835 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186812 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186815 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186818 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186820 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186823 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186825 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186828 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186831 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186833 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186836 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186838 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186841 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186844 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186847 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186850 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186853 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186855 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186858 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186860 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186863 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:04.187347 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186865 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186868 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186871 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186873 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186876 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186878 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186881 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186883 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186886 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186889 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186891 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186894 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186896 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186899 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186903 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186906 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186908 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186911 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186914 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186917 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:04.187925 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186922 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186925 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186928 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186930 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186933 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186936 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186938 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186941 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186944 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186946 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186949 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186951 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186954 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186957 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186961 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186965 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186969 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186973 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186975 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:04.188430 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186978 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186982 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.186987 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187004 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187008 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.187013 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187111 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187117 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187121 2566 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187124 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187127 2566 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187130 2566 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187133 2566 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187136 2566 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187139 2566 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 18:17:04.188895 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187141 2566 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187144 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187148 2566 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187150 2566 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187153 2566 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187156 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187158 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187161 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187163 2566 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187165 2566 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187168 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187171 2566 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187173 2566 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187177 2566 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187179 2566 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187182 2566 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187184 2566 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187187 2566 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187190 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187192 2566 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 18:17:04.189294 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187195 2566 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187197 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187200 2566 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187202 2566 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187205 2566 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187208 2566 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187211 2566 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187213 2566 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187216 2566 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187218 2566 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187221 2566 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187224 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187226 2566 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187230 2566 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187233 2566 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187236 2566 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187239 2566 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187241 2566 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187244 2566 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 18:17:04.189776 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187246 2566 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187249 2566 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187251 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187254 2566 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187257 2566 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187259 2566 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187262 2566 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187266 2566 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187268 2566 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187271 2566 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187273 2566 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187276 2566 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187279 2566 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187281 2566 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187284 2566 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187286 2566 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187289 2566 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187291 2566 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187294 2566 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187297 2566 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 18:17:04.190256 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187299 2566 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187302 2566 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187304 2566 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187307 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187309 2566 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187312 2566 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187314 2566 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187317 2566 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187319 2566 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187322 2566 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187325 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187327 2566 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187330 2566 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187332 2566 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187334 2566 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187337 2566 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187339 2566 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 18:17:04.190747 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:04.187343 2566 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 18:17:04.191173 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.187348 2566 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 18:17:04.191173 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.188029 2566 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 18:17:04.191449 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.191434 2566 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 18:17:04.192338 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.192327 2566 server.go:1019] "Starting client certificate rotation"
Apr 16 18:17:04.192452 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.192435 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:04.192494 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.192469 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 18:17:04.213258 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.213239 2566 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:04.215898 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.215883 2566 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 18:17:04.227106 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.227088 2566 log.go:25] "Validated CRI v1 runtime API"
Apr 16 18:17:04.231395 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.231381 2566 log.go:25] "Validated CRI v1 image API"
Apr 16 18:17:04.232644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.232609 2566 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 18:17:04.236145 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.236127 2566 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 c529ec01-2360-4873-937a-6f1b60ac2ad5:/dev/nvme0n1p3 df114a97-e6a4-4ec5-8871-643d7ef4c157:/dev/nvme0n1p4]
Apr 16 18:17:04.236195 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.236146 2566 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 18:17:04.242507 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.242384 2566 manager.go:217] Machine: {Timestamp:2026-04-16 18:17:04.240795896 +0000 UTC m=+0.333083853 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101590 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2e1ac727fdb112dfe5073a388a6e5d SystemUUID:ec2e1ac7-27fd-b112-dfe5-073a388a6e5d BootID:5f21bf3d-7d5a-4fd4-a408-df262f0596a4 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:bd:3d:91:24:19 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:bd:3d:91:24:19 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c2:69:9a:ba:4c:15 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 18:17:04.242507 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.242497 2566 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
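
[Editor's note] The repeated "unrecognized feature gate" bursts above are expected on OpenShift: the kubelet is handed the cluster-wide feature-gate set, keeps the upstream Kubernetes gates it knows (echoed in the "feature gates: {map[...]}" summary lines), and only warns on the OpenShift-only names; the burst repeats because the gate map is parsed several times during startup. A minimal Go sketch of this keep-known, warn-on-unknown behavior follows; the gate table and function name are illustrative, not the real k8s.io/component-base/featuregate implementation.

    package main

    import "fmt"

    // Illustrative gate table only; the real set lives in
    // k8s.io/component-base/featuregate and is far larger. Defaults here
    // mirror a few entries from the summary lines above.
    var knownGates = map[string]bool{
    	"DynamicResourceAllocation": false,
    	"ImageVolume":               true,
    	"KMSv1":                     false,
    	"NodeSwap":                  false,
    }

    // setFromMap keeps known gates, applies requested overrides, and only
    // warns on unknown names, mirroring the feature_gate.go:328 lines.
    func setFromMap(requested map[string]bool) map[string]bool {
    	effective := make(map[string]bool, len(knownGates))
    	for name, def := range knownGates {
    		effective[name] = def
    	}
    	for name, val := range requested {
    		if _, ok := knownGates[name]; !ok {
    			fmt.Printf("W feature_gate: unrecognized feature gate: %s\n", name)
    			continue
    		}
    		effective[name] = val
    	}
    	return effective
    }

    func main() {
    	// MachineConfigNodes is OpenShift-only, so this sketch warns and drops it.
    	fmt.Println(setFromMap(map[string]bool{"KMSv1": true, "MachineConfigNodes": true}))
    }
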
Apr 16 18:17:04.242627 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.242571 2566 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 18:17:04.244272 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.244245 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:04.244529 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.244504 2566 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 18:17:04.244924 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.244536 2566 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-136-226.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 18:17:04.244978 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.244935 2566 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 18:17:04.244978 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.244947 2566 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 18:17:04.244978 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.244961 2566 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:04.245597 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.245586 2566 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 18:17:04.247245 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.247234 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:04.247350 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.247341 2566 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 18:17:04.249139 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.249130 2566 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 18:17:04.249179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.249142 2566 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 18:17:04.249179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.249153 2566 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 18:17:04.249179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.249162 2566 kubelet.go:397] "Adding apiserver pod source"
Apr 16 18:17:04.249179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.249170 2566 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 18:17:04.250144 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.250132 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:17:04.250209 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.250149 2566 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 18:17:04.252671 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.252648 2566 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 18:17:04.254288 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.254272 2566 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 18:17:04.255439 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255424 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255445 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255455 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255464 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255473 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255482 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255491 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255501 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 18:17:04.255515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255511 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 18:17:04.255760 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255521 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 18:17:04.255760 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255534 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 18:17:04.255760 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.255548 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 18:17:04.256287 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.256277 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 18:17:04.256343 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.256290 2566 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 18:17:04.259686 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.259670 2566 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 18:17:04.259760 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.259712 2566 server.go:1295] "Started kubelet"
Apr 16 18:17:04.259820 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.259791 2566 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 18:17:04.259861 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.259819 2566 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 18:17:04.259908 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.259880 2566 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 18:17:04.260417 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.260370 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-136-226.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 18:17:04.260546 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.260532 2566 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-136-226.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 18:17:04.260559 ip-10-0-136-226 systemd[1]: Started Kubernetes Kubelet.
Apr 16 18:17:04.260649 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.260603 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 18:17:04.261176 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.260986 2566 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 18:17:04.262289 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.262271 2566 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 18:17:04.266087 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.265301 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-226.ec2.internal.18a6e91f6efeb3d1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-226.ec2.internal,UID:ip-10-0-136-226.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-136-226.ec2.internal,},FirstTimestamp:2026-04-16 18:17:04.259683281 +0000 UTC m=+0.351971240,LastTimestamp:2026-04-16 18:17:04.259683281 +0000 UTC m=+0.351971240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-226.ec2.internal,}"
Apr 16 18:17:04.266982 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.266961 2566 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:04.267249 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.267226 2566 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 18:17:04.267450 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.267437 2566 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 18:17:04.268690 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.268665 2566 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 18:17:04.268690 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.268689 2566 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 18:17:04.268843 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.268814 2566 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 18:17:04.268902 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.268862 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.268950 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.268902 2566 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 18:17:04.268950 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.268912 2566 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 18:17:04.269142 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.269128 2566 factory.go:55] Registering systemd factory
Apr 16 18:17:04.269198 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.269179 2566 factory.go:223] Registration of the systemd container factory successfully
Apr 16 18:17:04.269396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.269385 2566 factory.go:153] Registering CRI-O factory
Apr 16 18:17:04.269396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.269396 2566 factory.go:223] Registration of the crio container factory successfully
Apr 16 18:17:04.269496 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.269456 2566 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 18:17:04.269496 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.269483 2566 factory.go:103] Registering Raw factory
Apr 16 18:17:04.269496 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.269493 2566 manager.go:1196] Started watching for new ooms in manager
Apr 16 18:17:04.270048 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.270028 2566 manager.go:319] Starting recovery of all containers
Apr 16 18:17:04.279393 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.279366 2566 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 18:17:04.279494 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.279433 2566 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-136-226.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 18:17:04.279671 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.279640 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 18:17:04.283318 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.283301 2566 manager.go:324] Recovery completed
Apr 16 18:17:04.287422 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.287409 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:04.289757 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.289741 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:04.289829 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.289774 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:04.289829 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.289789 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:04.290324 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.290311 2566 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 18:17:04.290324 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.290324 2566 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 18:17:04.290416 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.290339 2566 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 18:17:04.292595 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.292583 2566 policy_none.go:49] "None policy: Start"
Apr 16 18:17:04.292644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.292598 2566 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 18:17:04.292644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.292607 2566 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 18:17:04.293299 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.293229 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-226.ec2.internal.18a6e91f70c99791 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-226.ec2.internal,UID:ip-10-0-136-226.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-136-226.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-136-226.ec2.internal,},FirstTimestamp:2026-04-16 18:17:04.289757073 +0000 UTC m=+0.382045033,LastTimestamp:2026-04-16 18:17:04.289757073 +0000 UTC m=+0.382045033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-226.ec2.internal,}"
Apr 16 18:17:04.305555 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.305437 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9pgt9"
Apr 16 18:17:04.306600 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.306542 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-226.ec2.internal.18a6e91f70c9f5d1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-226.ec2.internal,UID:ip-10-0-136-226.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-136-226.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-136-226.ec2.internal,},FirstTimestamp:2026-04-16 18:17:04.289781201 +0000 UTC m=+0.382069159,LastTimestamp:2026-04-16 18:17:04.289781201 +0000 UTC m=+0.382069159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-226.ec2.internal,}"
Apr 16 18:17:04.318058 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.317974 2566 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-136-226.ec2.internal.18a6e91f70ca2b5f default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-136-226.ec2.internal,UID:ip-10-0-136-226.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-136-226.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-136-226.ec2.internal,},FirstTimestamp:2026-04-16 18:17:04.289794911 +0000 UTC m=+0.382082868,LastTimestamp:2026-04-16 18:17:04.289794911 +0000 UTC m=+0.382082868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-136-226.ec2.internal,}"
Apr 16 18:17:04.318555 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.318535 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9pgt9"
Apr 16 18:17:04.338619 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.338602 2566 manager.go:341] "Starting Device Plugin manager"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.338640 2566 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.338651 2566 server.go:85] "Starting device plugin registration server"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.338905 2566 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.338917 2566 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.339051 2566 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.339130 2566 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.339139 2566 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.339703 2566 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 18:17:04.350684 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.339731 2566 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.439645 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.439574 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:04.440582 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.440559 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:04.440667 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.440589 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:04.440667 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.440601 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:04.440667 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.440633 2566 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.447445 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.447425 2566 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 18:17:04.447520 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.447456 2566 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 18:17:04.447520 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.447477 2566 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 18:17:04.447520 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.447487 2566 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 18:17:04.447648 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.447524 2566 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 18:17:04.450190 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.450167 2566 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.450190 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.450188 2566 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-136-226.ec2.internal\": node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.451250 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.451235 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:04.477440 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.477417 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.548456 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.548423 2566 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal"]
Apr 16 18:17:04.548573 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.548542 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:04.549562 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.549544 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:04.549666 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.549576 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:04.549666 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.549589 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:04.550700 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.550688 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:04.550838 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.550822 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.550910 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.550859 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:04.551461 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.551445 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:04.551545 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.551447 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:04.551545 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.551510 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:04.551545 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.551521 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:04.551545 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.551474 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:04.551682 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.551549 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:04.553085 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.553072 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.553134 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.553099 2566 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 18:17:04.553812 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.553799 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 18:17:04.553880 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.553822 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 18:17:04.553880 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.553835 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeHasSufficientPID"
Apr 16 18:17:04.570365 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.570335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e14ab5a37170f1f8894dc8ae352b1322-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal\" (UID: \"e14ab5a37170f1f8894dc8ae352b1322\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.570446 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.570370 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e14ab5a37170f1f8894dc8ae352b1322-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal\" (UID: \"e14ab5a37170f1f8894dc8ae352b1322\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.570446 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.570392 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/772bd1ceeee63d6bb44b05ffabbad76d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-226.ec2.internal\" (UID: \"772bd1ceeee63d6bb44b05ffabbad76d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.577533 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.577513 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.577683 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.577656 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-226.ec2.internal\" not found" node="ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.581810 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.581793 2566 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-136-226.ec2.internal\" not found" node="ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.671542 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.671512 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e14ab5a37170f1f8894dc8ae352b1322-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal\" (UID: \"e14ab5a37170f1f8894dc8ae352b1322\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.671542 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.671540 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e14ab5a37170f1f8894dc8ae352b1322-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal\" (UID: \"e14ab5a37170f1f8894dc8ae352b1322\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.671714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.671558 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/772bd1ceeee63d6bb44b05ffabbad76d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-226.ec2.internal\" (UID: \"772bd1ceeee63d6bb44b05ffabbad76d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.671714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.671592 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e14ab5a37170f1f8894dc8ae352b1322-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal\" (UID: \"e14ab5a37170f1f8894dc8ae352b1322\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.671714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.671624 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e14ab5a37170f1f8894dc8ae352b1322-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal\" (UID: \"e14ab5a37170f1f8894dc8ae352b1322\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.671714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.671654 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/772bd1ceeee63d6bb44b05ffabbad76d-config\") pod \"kube-apiserver-proxy-ip-10-0-136-226.ec2.internal\" (UID: \"772bd1ceeee63d6bb44b05ffabbad76d\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.678630 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.678599 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.779383 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.779320 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.879923 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.879891 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:04.881020 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.880988 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.884613 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:04.884597 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:04.980776 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:04.980738 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:05.081258 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:05.081192 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:05.169108 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.169085 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:05.182188 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:05.182166 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:05.193328 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.193304 2566 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 18:17:05.193465 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.193449 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:05.193518 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.193467 2566 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 18:17:05.267872 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.267852 2566 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 18:17:05.282518 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:05.282496 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:05.287371 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.287327 2566 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 18:17:05.314684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.314660 2566 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xs2sz"
Apr 16 18:17:05.320817 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.320776 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 18:12:04 +0000 UTC" deadline="2027-09-24 02:44:48.953928457 +0000 UTC"
Apr 16 18:17:05.320817 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.320815 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12608h27m43.633116942s"
Apr 16 18:17:05.324474 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.324457 2566 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xs2sz"
Apr 16 18:17:05.343855 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.343827 2566 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:05.349981 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:05.349942 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode14ab5a37170f1f8894dc8ae352b1322.slice/crio-fe2e9b365227f611ae26fff2011e58260a544e0f9b1116664e5bf6ffe00f49e9 WatchSource:0}: Error finding container fe2e9b365227f611ae26fff2011e58260a544e0f9b1116664e5bf6ffe00f49e9: Status 404 returned error can't find the container with id fe2e9b365227f611ae26fff2011e58260a544e0f9b1116664e5bf6ffe00f49e9
Apr 16 18:17:05.350243 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:05.350222 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod772bd1ceeee63d6bb44b05ffabbad76d.slice/crio-8940db22c2719deb6a69d96bedc52d6a40a8419fdd138eb11c1774660666379d WatchSource:0}: Error finding container 8940db22c2719deb6a69d96bedc52d6a40a8419fdd138eb11c1774660666379d: Status 404 returned error can't find the container with id 8940db22c2719deb6a69d96bedc52d6a40a8419fdd138eb11c1774660666379d
Apr 16 18:17:05.355125 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.355110 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:17:05.383589 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:05.383568 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:05.450652 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.450605 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal" event={"ID":"e14ab5a37170f1f8894dc8ae352b1322","Type":"ContainerStarted","Data":"fe2e9b365227f611ae26fff2011e58260a544e0f9b1116664e5bf6ffe00f49e9"}
Apr 16 18:17:05.451551 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.451522 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal" event={"ID":"772bd1ceeee63d6bb44b05ffabbad76d","Type":"ContainerStarted","Data":"8940db22c2719deb6a69d96bedc52d6a40a8419fdd138eb11c1774660666379d"}
Apr 16 18:17:05.484020 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:05.484003 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:05.584457 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:05.584434 2566 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-136-226.ec2.internal\" not found"
Apr 16 18:17:05.652340 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.652284 2566 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 18:17:05.669374 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.669357 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:05.684797 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.684776 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:05.685642 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.685629 2566 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal"
Apr 16 18:17:05.694552 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:05.694531 2566 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 16 18:17:06.250660 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.250590 2566 apiserver.go:52] "Watching apiserver"
Apr 16 18:17:06.257277 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.257254 2566 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 16 18:17:06.257646 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.257616 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-6tlbq","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj","openshift-cluster-node-tuning-operator/tuned-gs5pw","openshift-image-registry/node-ca-hp46j","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal","openshift-multus/multus-additional-cni-plugins-fl2g2","openshift-multus/multus-gs6xw","openshift-multus/network-metrics-daemon-jj9db","kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal","openshift-network-diagnostics/network-check-target-52dbn","openshift-network-operator/iptables-alerter-p2gt8","openshift-ovn-kubernetes/ovnkube-node-kxwr5"]
Apr 16 18:17:06.259697 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.259679 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6tlbq"
Apr 16 18:17:06.260844 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.260820 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.261859 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.261840 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.262802 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.262742 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 16 18:17:06.262802 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.262763 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-nk5z8\""
Apr 16 18:17:06.262935 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.262848 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hp46j"
Apr 16 18:17:06.263210 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.263194 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 16 18:17:06.263303 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.263236 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 16 18:17:06.263705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.263610 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 16 18:17:06.263705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.263615 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 16 18:17:06.263705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.263650 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-fwp8m\""
Apr 16 18:17:06.264007 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.263957 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.264548 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.264219 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:06.264548 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.264246 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:06.264948 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.264927 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-t4g87\""
Apr 16 18:17:06.265236 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.265187 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 16 18:17:06.265340 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.265254 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.265671 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.265607 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 16 18:17:06.265753 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.265675 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 16 18:17:06.265753 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.265740 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-zzn7w\""
Apr 16 18:17:06.266643 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.266625 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 16 18:17:06.266726 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.266640 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 16 18:17:06.266726 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.266676 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 16 18:17:06.266726 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.266681 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-d4rm7\""
Apr 16 18:17:06.268766 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.268078 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 16 18:17:06.268766 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.268338 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kfgpq\""
Apr 16 18:17:06.268766 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.268484 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 16 18:17:06.268766 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.268648 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 16 18:17:06.270431 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.270211 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn"
Apr 16 18:17:06.270431 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.270401 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db"
Apr 16 18:17:06.271836 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.271814 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db"
Apr 16 18:17:06.271923 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.271889 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004"
Apr 16 18:17:06.271923 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.271912 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p2gt8"
Apr 16 18:17:06.273481 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.273459 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5"
Apr 16 18:17:06.274188 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.274168 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 16 18:17:06.274433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.274415 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 16 18:17:06.274433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.274432 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-5cb9n\""
Apr 16 18:17:06.274582 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.274420 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 16 18:17:06.275922 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.275902 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 16 18:17:06.276035 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.275934 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-pr74h\""
Apr 16 18:17:06.276165 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.276135 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 16 18:17:06.276226 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.276184 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 16 18:17:06.276332 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.276313 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 16 18:17:06.276531 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.276514 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 16 18:17:06.276531 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.276526 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 16 18:17:06.279215 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279170 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysconfig\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279215 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279203 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-cnibin\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.279360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279226 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bztns\" (UniqueName: \"kubernetes.io/projected/25aae314-4a74-4705-b118-50fda5694b79-kube-api-access-bztns\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279253 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1c68ba07-dba4-4de5-923b-da334bafc1fb-konnectivity-ca\") pod \"konnectivity-agent-6tlbq\" (UID: \"1c68ba07-dba4-4de5-923b-da334bafc1fb\") " pod="kube-system/konnectivity-agent-6tlbq"
Apr 16 18:17:06.279360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279277 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-kubernetes\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279301 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysctl-conf\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279322 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-run\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279343 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aceee498-5925-409d-b85d-233e32fb5593-etc-tuned\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279366 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-system-cni-dir\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.279620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279389 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-system-cni-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279431 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-cni-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279485 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-cni-bin\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279531 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-conf-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279561 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-netns\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279582 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-device-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279622 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-lib-modules\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279655 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-k8s-cni-cncf-io\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279681 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1c68ba07-dba4-4de5-923b-da334bafc1fb-agent-certs\") pod \"konnectivity-agent-6tlbq\" (UID: \"1c68ba07-dba4-4de5-923b-da334bafc1fb\") " pod="kube-system/konnectivity-agent-6tlbq"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279722 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-systemd\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279758 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-sys\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279786 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279810 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-cnibin\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279835 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-multus-certs\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279867 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysctl-d\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.279907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279889 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-host\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279911 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbnqs\" (UniqueName: \"kubernetes.io/projected/aceee498-5925-409d-b85d-233e32fb5593-kube-api-access-hbnqs\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279936 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt9dz\" (UniqueName: \"kubernetes.io/projected/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-kube-api-access-jt9dz\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.279981 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-os-release\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280029 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280054 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-sys-fs\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280079 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-os-release\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280103 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25aae314-4a74-4705-b118-50fda5694b79-multus-daemon-config\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280124 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-socket-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280146 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-etc-selinux\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280170 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pj6qp\" (UniqueName: \"kubernetes.io/projected/d1962cad-7d60-4a0c-9f80-27474c5ef678-kube-api-access-pj6qp\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280195 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280220 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-serviceca\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280244 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280270 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj79l\" (UniqueName: \"kubernetes.io/projected/bd001d43-c6f4-44f4-906e-c01f02068004-kube-api-access-bj79l\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280298 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn"
Apr 16 18:17:06.280382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280323 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-cni-binary-copy\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280343 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-cni-multus\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280356 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aceee498-5925-409d-b85d-233e32fb5593-tmp\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280370 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25aae314-4a74-4705-b118-50fda5694b79-cni-binary-copy\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280383 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-socket-dir-parent\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280400 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-hostroot\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280431 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-registration-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280486 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280511 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-etc-kubernetes\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280536 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mt7xt\" (UniqueName: \"kubernetes.io/projected/b9fbc6e2-6448-4213-ac02-c0df39de143e-kube-api-access-mt7xt\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280560 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-kubelet\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280584 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-modprobe-d\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280608 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-var-lib-kubelet\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.281019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.280630 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-host\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j"
Apr 16 18:17:06.325643 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.325611 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:05 +0000 UTC" deadline="2027-12-17 21:28:20.971504673 +0000 UTC"
Apr 16 18:17:06.325643 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.325641 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14643h11m14.645867378s"
Apr 16 18:17:06.370328 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.370297 2566 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 16 18:17:06.381774 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381732 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-socket-dir-parent\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.381919 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381776 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-hostroot\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.381919 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381805 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a759571b-0c88-4edc-829c-d1cdc47b056f-iptables-alerter-script\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8"
Apr 16 18:17:06.381919 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381851 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gd66z\" (UniqueName: \"kubernetes.io/projected/a759571b-0c88-4edc-829c-d1cdc47b056f-kube-api-access-gd66z\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8"
Apr 16 18:17:06.381919 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381873 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-hostroot\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.381919 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381875 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-registration-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.382200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381938 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-registration-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj"
Apr 16 18:17:06.382200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381954 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-socket-dir-parent\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.382200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.381944 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.382200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382064 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbp8x\" (UniqueName: \"kubernetes.io/projected/dca169f9-fe56-4084-aff9-5a447ae82401-kube-api-access-jbp8x\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5"
Apr 16 18:17:06.382200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382098 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-etc-kubernetes\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.382200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382161 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mt7xt\" (UniqueName: \"kubernetes.io/projected/b9fbc6e2-6448-4213-ac02-c0df39de143e-kube-api-access-mt7xt\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2"
Apr 16 18:17:06.382200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382196 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-kubelet\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.382467 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382235 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-etc-kubernetes\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.382467 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382249 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-kubelet\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw"
Apr 16 18:17:06.382467 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382366 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a759571b-0c88-4edc-829c-d1cdc47b056f-host-slash\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8"
Apr 16 18:17:06.382467 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382402 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-ovn\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5"
Apr 16 18:17:06.382467 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382431 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-modprobe-d\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.382467 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382461 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-var-lib-kubelet\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw"
Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-host\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j"
Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382517 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName:
\"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-host\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382542 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-var-lib-kubelet\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382537 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382598 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-modprobe-d\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysconfig\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382640 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-cnibin\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382664 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bztns\" (UniqueName: \"kubernetes.io/projected/25aae314-4a74-4705-b118-50fda5694b79-kube-api-access-bztns\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382695 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-env-overrides\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.382737 ip-10-0-136-226 
kubenswrapper[2566]: I0416 18:17:06.382722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1c68ba07-dba4-4de5-923b-da334bafc1fb-konnectivity-ca\") pod \"konnectivity-agent-6tlbq\" (UID: \"1c68ba07-dba4-4de5-923b-da334bafc1fb\") " pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:06.382737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382731 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysconfig\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382745 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-kubernetes\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382733 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-cnibin\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysctl-conf\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382812 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-run\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382927 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysctl-conf\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382947 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aceee498-5925-409d-b85d-233e32fb5593-etc-tuned\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.382965 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-system-cni-dir\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " 
pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383003 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-system-cni-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383044 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-system-cni-dir\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383053 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-cni-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-system-cni-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383082 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-cni-bin\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383107 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-conf-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383150 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-kubelet\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383173 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-slash\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383198 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-systemd\") pod \"ovnkube-node-kxwr5\" (UID: 
\"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.383292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383224 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-log-socket\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383254 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-netns\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383289 2566 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383301 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-netns\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383315 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-kubernetes\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383347 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-var-lib-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383374 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-device-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383387 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-run\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383392 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1c68ba07-dba4-4de5-923b-da334bafc1fb-konnectivity-ca\") pod \"konnectivity-agent-6tlbq\" (UID: \"1c68ba07-dba4-4de5-923b-da334bafc1fb\") " 
pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383405 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-cni-bin\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383410 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-lib-modules\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383416 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-cni-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383428 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-multus-conf-dir\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383434 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-device-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383444 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-k8s-cni-cncf-io\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383481 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-cni-bin\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383484 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-k8s-cni-cncf-io\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383560 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1c68ba07-dba4-4de5-923b-da334bafc1fb-agent-certs\") pod \"konnectivity-agent-6tlbq\" (UID: 
\"1c68ba07-dba4-4de5-923b-da334bafc1fb\") " pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:06.384055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383563 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-lib-modules\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383591 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-systemd\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383628 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-sys\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383635 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-systemd\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383686 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-sys\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383689 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383748 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-cnibin\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383774 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-multus-certs\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383803 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-cnibin\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " 
pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383820 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysctl-d\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383845 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-host\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383855 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-run-multus-certs\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383871 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hbnqs\" (UniqueName: \"kubernetes.io/projected/aceee498-5925-409d-b85d-233e32fb5593-kube-api-access-hbnqs\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383910 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jt9dz\" (UniqueName: \"kubernetes.io/projected/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-kube-api-access-jt9dz\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383917 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-host\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383973 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/aceee498-5925-409d-b85d-233e32fb5593-etc-sysctl-d\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.383981 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-os-release\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384024 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: 
\"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.384884 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384061 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-sys-fs\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384085 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-os-release\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384124 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-kubelet-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384128 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25aae314-4a74-4705-b118-50fda5694b79-multus-daemon-config\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384133 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-os-release\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384184 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-sys-fs\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384226 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-os-release\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384212 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-systemd-units\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384264 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-run-netns\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384290 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384315 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384344 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-socket-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384381 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-etc-selinux\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384453 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pj6qp\" (UniqueName: \"kubernetes.io/projected/d1962cad-7d60-4a0c-9f80-27474c5ef678-kube-api-access-pj6qp\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384472 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b9fbc6e2-6448-4213-ac02-c0df39de143e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384481 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-etc-selinux\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384491 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: 
\"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.385685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384490 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d1962cad-7d60-4a0c-9f80-27474c5ef678-socket-dir\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384529 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-node-log\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384559 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-ovnkube-script-lib\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384590 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-serviceca\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384622 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bj79l\" (UniqueName: \"kubernetes.io/projected/bd001d43-c6f4-44f4-906e-c01f02068004-kube-api-access-bj79l\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384681 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-ovnkube-config\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384793 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dca169f9-fe56-4084-aff9-5a447ae82401-ovn-node-metrics-cert\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384822 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.384867 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/25aae314-4a74-4705-b118-50fda5694b79-multus-daemon-config\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.384927 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.385041 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.884975519 +0000 UTC m=+2.977263479 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385120 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-serviceca\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385145 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-etc-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385178 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-cni-binary-copy\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385223 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.386362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385234 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-cni-multus\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385264 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-cni-netd\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385305 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aceee498-5925-409d-b85d-233e32fb5593-tmp\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385330 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/25aae314-4a74-4705-b118-50fda5694b79-host-var-lib-cni-multus\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385380 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25aae314-4a74-4705-b118-50fda5694b79-cni-binary-copy\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385704 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b9fbc6e2-6448-4213-ac02-c0df39de143e-cni-binary-copy\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.385917 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/25aae314-4a74-4705-b118-50fda5694b79-cni-binary-copy\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.386863 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/aceee498-5925-409d-b85d-233e32fb5593-etc-tuned\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.387146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.387103 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1c68ba07-dba4-4de5-923b-da334bafc1fb-agent-certs\") pod \"konnectivity-agent-6tlbq\" (UID: \"1c68ba07-dba4-4de5-923b-da334bafc1fb\") " pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:06.387750 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.387444 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/aceee498-5925-409d-b85d-233e32fb5593-tmp\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.392289 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.392248 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bztns\" (UniqueName: \"kubernetes.io/projected/25aae314-4a74-4705-b118-50fda5694b79-kube-api-access-bztns\") pod \"multus-gs6xw\" (UID: \"25aae314-4a74-4705-b118-50fda5694b79\") " pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.396521 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.395876 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mt7xt\" (UniqueName: \"kubernetes.io/projected/b9fbc6e2-6448-4213-ac02-c0df39de143e-kube-api-access-mt7xt\") pod \"multus-additional-cni-plugins-fl2g2\" (UID: \"b9fbc6e2-6448-4213-ac02-c0df39de143e\") " pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.397110 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.397068 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pj6qp\" (UniqueName: \"kubernetes.io/projected/d1962cad-7d60-4a0c-9f80-27474c5ef678-kube-api-access-pj6qp\") pod \"aws-ebs-csi-driver-node-kztnj\" (UID: \"d1962cad-7d60-4a0c-9f80-27474c5ef678\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.397570 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.397093 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt9dz\" (UniqueName: \"kubernetes.io/projected/b06d5b5e-fa9c-4211-acc1-3b2c5f851673-kube-api-access-jt9dz\") pod \"node-ca-hp46j\" (UID: \"b06d5b5e-fa9c-4211-acc1-3b2c5f851673\") " pod="openshift-image-registry/node-ca-hp46j" Apr 16 18:17:06.400503 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.400483 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:06.400589 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.400507 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:06.400589 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.400521 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fgl76 for pod openshift-network-diagnostics/network-check-target-52dbn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:06.400685 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.400620 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76 podName:3dd548ef-63ff-4ea7-825d-0fa73a6487db nodeName:}" failed. No retries permitted until 2026-04-16 18:17:06.90059904 +0000 UTC m=+2.992886999 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fgl76" (UniqueName: "kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76") pod "network-check-target-52dbn" (UID: "3dd548ef-63ff-4ea7-825d-0fa73a6487db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:06.401085 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.401065 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbnqs\" (UniqueName: \"kubernetes.io/projected/aceee498-5925-409d-b85d-233e32fb5593-kube-api-access-hbnqs\") pod \"tuned-gs5pw\" (UID: \"aceee498-5925-409d-b85d-233e32fb5593\") " pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.405832 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.405813 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj79l\" (UniqueName: \"kubernetes.io/projected/bd001d43-c6f4-44f4-906e-c01f02068004-kube-api-access-bj79l\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:06.486656 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486625 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-systemd-units\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486656 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486663 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-run-netns\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486681 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486727 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486755 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-run-netns\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486734 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-systemd-units\") pod \"ovnkube-node-kxwr5\" 
(UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486772 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-node-log\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486802 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-ovnkube-script-lib\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486803 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.486866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486821 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-run-ovn-kubernetes\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486885 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-node-log\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-ovnkube-config\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.486965 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dca169f9-fe56-4084-aff9-5a447ae82401-ovn-node-metrics-cert\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487011 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-etc-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487036 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-cni-netd\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487059 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a759571b-0c88-4edc-829c-d1cdc47b056f-iptables-alerter-script\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487085 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gd66z\" (UniqueName: \"kubernetes.io/projected/a759571b-0c88-4edc-829c-d1cdc47b056f-kube-api-access-gd66z\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jbp8x\" (UniqueName: \"kubernetes.io/projected/dca169f9-fe56-4084-aff9-5a447ae82401-kube-api-access-jbp8x\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487158 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-cni-netd\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487193 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a759571b-0c88-4edc-829c-d1cdc47b056f-host-slash\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8" Apr 16 18:17:06.487227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487217 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-ovn\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487244 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-env-overrides\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 
ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487330 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-kubelet\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487364 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-slash\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487389 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-systemd\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487393 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-ovnkube-script-lib\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487419 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-log-socket\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487446 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-var-lib-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487456 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487478 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-cni-bin\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487495 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-etc-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: 
\"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487538 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-ovnkube-config\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-log-socket\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487554 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-systemd\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487551 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-var-lib-openvswitch\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487596 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-kubelet\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.487705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487604 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a759571b-0c88-4edc-829c-d1cdc47b056f-host-slash\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8" Apr 16 18:17:06.488517 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487619 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-run-ovn\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.488517 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487632 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-slash\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.488517 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487643 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/dca169f9-fe56-4084-aff9-5a447ae82401-host-cni-bin\") pod \"ovnkube-node-kxwr5\" (UID: 
\"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.488517 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487654 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a759571b-0c88-4edc-829c-d1cdc47b056f-iptables-alerter-script\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8" Apr 16 18:17:06.488517 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.487907 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/dca169f9-fe56-4084-aff9-5a447ae82401-env-overrides\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.489644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.489623 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/dca169f9-fe56-4084-aff9-5a447ae82401-ovn-node-metrics-cert\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.496841 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.496822 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbp8x\" (UniqueName: \"kubernetes.io/projected/dca169f9-fe56-4084-aff9-5a447ae82401-kube-api-access-jbp8x\") pod \"ovnkube-node-kxwr5\" (UID: \"dca169f9-fe56-4084-aff9-5a447ae82401\") " pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.497132 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.497112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gd66z\" (UniqueName: \"kubernetes.io/projected/a759571b-0c88-4edc-829c-d1cdc47b056f-kube-api-access-gd66z\") pod \"iptables-alerter-p2gt8\" (UID: \"a759571b-0c88-4edc-829c-d1cdc47b056f\") " pod="openshift-network-operator/iptables-alerter-p2gt8" Apr 16 18:17:06.529094 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.529016 2566 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:06.572352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.572321 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:06.579082 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.579059 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" Apr 16 18:17:06.587737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.587716 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" Apr 16 18:17:06.592278 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.592260 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-hp46j" Apr 16 18:17:06.598850 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.598832 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" Apr 16 18:17:06.604717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.604697 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-gs6xw" Apr 16 18:17:06.613231 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.613209 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p2gt8" Apr 16 18:17:06.618845 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.618830 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:06.890093 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.889982 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:06.890257 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.890116 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:06.890257 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.890190 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:07.890171552 +0000 UTC m=+3.982459497 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:06.947383 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:06.947358 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb06d5b5e_fa9c_4211_acc1_3b2c5f851673.slice/crio-4394f7e06ae171f2f4b2325a416e80d5aecfd27928b090989cb336834937fba5 WatchSource:0}: Error finding container 4394f7e06ae171f2f4b2325a416e80d5aecfd27928b090989cb336834937fba5: Status 404 returned error can't find the container with id 4394f7e06ae171f2f4b2325a416e80d5aecfd27928b090989cb336834937fba5 Apr 16 18:17:06.948210 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:06.948106 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddca169f9_fe56_4084_aff9_5a447ae82401.slice/crio-dd48cd2d0ab59491fb4b5bb853dc6ec2b68f2948c2041f990b8f0ba3c345f403 WatchSource:0}: Error finding container dd48cd2d0ab59491fb4b5bb853dc6ec2b68f2948c2041f990b8f0ba3c345f403: Status 404 returned error can't find the container with id dd48cd2d0ab59491fb4b5bb853dc6ec2b68f2948c2041f990b8f0ba3c345f403 Apr 16 18:17:06.949806 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:06.949775 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25aae314_4a74_4705_b118_50fda5694b79.slice/crio-edd0bee0441c82b96851bbb7e4c8ed21ada2f055447a9f1b617cd03bd7301ba8 WatchSource:0}: Error finding container edd0bee0441c82b96851bbb7e4c8ed21ada2f055447a9f1b617cd03bd7301ba8: Status 404 returned error can't find the container with id edd0bee0441c82b96851bbb7e4c8ed21ada2f055447a9f1b617cd03bd7301ba8 Apr 16 18:17:06.950625 ip-10-0-136-226 kubenswrapper[2566]: W0416 
18:17:06.950504 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda759571b_0c88_4edc_829c_d1cdc47b056f.slice/crio-c994f2f29df8f134885dada7be4ff9b4ee7a3a913d30e6648017dba9289d4826 WatchSource:0}: Error finding container c994f2f29df8f134885dada7be4ff9b4ee7a3a913d30e6648017dba9289d4826: Status 404 returned error can't find the container with id c994f2f29df8f134885dada7be4ff9b4ee7a3a913d30e6648017dba9289d4826 Apr 16 18:17:06.953075 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:06.952961 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9fbc6e2_6448_4213_ac02_c0df39de143e.slice/crio-98a5a64feb51df971b4802dd6c476363e0c05fcc6c2f6277ce09f1064c8b9038 WatchSource:0}: Error finding container 98a5a64feb51df971b4802dd6c476363e0c05fcc6c2f6277ce09f1064c8b9038: Status 404 returned error can't find the container with id 98a5a64feb51df971b4802dd6c476363e0c05fcc6c2f6277ce09f1064c8b9038 Apr 16 18:17:06.953844 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:06.953820 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c68ba07_dba4_4de5_923b_da334bafc1fb.slice/crio-1eeeb3c917ea148b8b3e3a00e43edbcd461d4ad023a2734f55d2af91dd7f9bd6 WatchSource:0}: Error finding container 1eeeb3c917ea148b8b3e3a00e43edbcd461d4ad023a2734f55d2af91dd7f9bd6: Status 404 returned error can't find the container with id 1eeeb3c917ea148b8b3e3a00e43edbcd461d4ad023a2734f55d2af91dd7f9bd6 Apr 16 18:17:06.955202 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:06.955181 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1962cad_7d60_4a0c_9f80_27474c5ef678.slice/crio-897c98387d62463bd47711bdf2f3124d0776a7483ebc9a34b5a272d0352703e2 WatchSource:0}: Error finding container 897c98387d62463bd47711bdf2f3124d0776a7483ebc9a34b5a272d0352703e2: Status 404 returned error can't find the container with id 897c98387d62463bd47711bdf2f3124d0776a7483ebc9a34b5a272d0352703e2 Apr 16 18:17:06.957460 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:06.957356 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaceee498_5925_409d_b85d_233e32fb5593.slice/crio-6c83bad36142142cdc0e2f97be2c37d7beb05571b35306db05021dc9408078bd WatchSource:0}: Error finding container 6c83bad36142142cdc0e2f97be2c37d7beb05571b35306db05021dc9408078bd: Status 404 returned error can't find the container with id 6c83bad36142142cdc0e2f97be2c37d7beb05571b35306db05021dc9408078bd Apr 16 18:17:06.990916 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:06.990892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:06.991062 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.991047 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:06.991114 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.991067 2566 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:06.991114 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.991077 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fgl76 for pod openshift-network-diagnostics/network-check-target-52dbn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:06.991201 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:06.991118 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76 podName:3dd548ef-63ff-4ea7-825d-0fa73a6487db nodeName:}" failed. No retries permitted until 2026-04-16 18:17:07.991106124 +0000 UTC m=+4.083394066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-fgl76" (UniqueName: "kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76") pod "network-check-target-52dbn" (UID: "3dd548ef-63ff-4ea7-825d-0fa73a6487db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:07.327178 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.327083 2566 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 18:12:05 +0000 UTC" deadline="2027-11-10 06:17:04.271756423 +0000 UTC" Apr 16 18:17:07.327178 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.327123 2566 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13739h59m56.94463735s" Apr 16 18:17:07.464754 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.464673 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6tlbq" event={"ID":"1c68ba07-dba4-4de5-923b-da334bafc1fb","Type":"ContainerStarted","Data":"1eeeb3c917ea148b8b3e3a00e43edbcd461d4ad023a2734f55d2af91dd7f9bd6"} Apr 16 18:17:07.469792 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.469727 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" event={"ID":"d1962cad-7d60-4a0c-9f80-27474c5ef678","Type":"ContainerStarted","Data":"897c98387d62463bd47711bdf2f3124d0776a7483ebc9a34b5a272d0352703e2"} Apr 16 18:17:07.474064 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.474035 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerStarted","Data":"98a5a64feb51df971b4802dd6c476363e0c05fcc6c2f6277ce09f1064c8b9038"} Apr 16 18:17:07.490461 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.490401 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"dd48cd2d0ab59491fb4b5bb853dc6ec2b68f2948c2041f990b8f0ba3c345f403"} Apr 16 18:17:07.499391 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.498851 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal" 
event={"ID":"772bd1ceeee63d6bb44b05ffabbad76d","Type":"ContainerStarted","Data":"0335e1cd2545f97669d0fa3aa2a039f05d22934b98ce06b48c6007cab51f483b"} Apr 16 18:17:07.509504 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.509474 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p2gt8" event={"ID":"a759571b-0c88-4edc-829c-d1cdc47b056f","Type":"ContainerStarted","Data":"c994f2f29df8f134885dada7be4ff9b4ee7a3a913d30e6648017dba9289d4826"} Apr 16 18:17:07.513955 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.512806 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gs6xw" event={"ID":"25aae314-4a74-4705-b118-50fda5694b79","Type":"ContainerStarted","Data":"edd0bee0441c82b96851bbb7e4c8ed21ada2f055447a9f1b617cd03bd7301ba8"} Apr 16 18:17:07.513955 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.513912 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hp46j" event={"ID":"b06d5b5e-fa9c-4211-acc1-3b2c5f851673","Type":"ContainerStarted","Data":"4394f7e06ae171f2f4b2325a416e80d5aecfd27928b090989cb336834937fba5"} Apr 16 18:17:07.529928 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.528640 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" event={"ID":"aceee498-5925-409d-b85d-233e32fb5593","Type":"ContainerStarted","Data":"6c83bad36142142cdc0e2f97be2c37d7beb05571b35306db05021dc9408078bd"} Apr 16 18:17:07.731094 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.729955 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-136-226.ec2.internal" podStartSLOduration=2.729934556 podStartE2EDuration="2.729934556s" podCreationTimestamp="2026-04-16 18:17:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:07.527399416 +0000 UTC m=+3.619687384" watchObservedRunningTime="2026-04-16 18:17:07.729934556 +0000 UTC m=+3.822222525" Apr 16 18:17:07.731094 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.730840 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-wtxhb"] Apr 16 18:17:07.734885 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.734530 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.734885 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:07.734616 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:07.798915 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.798887 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5504c48e-268b-4506-8112-5817466db907-kubelet-config\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.799050 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.798961 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.799050 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.799020 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5504c48e-268b-4506-8112-5817466db907-dbus\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.899766 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.899727 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5504c48e-268b-4506-8112-5817466db907-kubelet-config\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.899929 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.899817 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.899929 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.899852 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:07.899929 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.899878 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5504c48e-268b-4506-8112-5817466db907-dbus\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.900705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.900141 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5504c48e-268b-4506-8112-5817466db907-dbus\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.900705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:07.900219 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5504c48e-268b-4506-8112-5817466db907-kubelet-config\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:07.900705 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:07.900318 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:07.900705 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:07.900374 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret podName:5504c48e-268b-4506-8112-5817466db907 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:08.40035646 +0000 UTC m=+4.492644409 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret") pod "global-pull-secret-syncer-wtxhb" (UID: "5504c48e-268b-4506-8112-5817466db907") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:07.900705 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:07.900633 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:07.900705 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:07.900674 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:09.900661191 +0000 UTC m=+5.992949140 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:08.001558 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:08.000870 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:08.001558 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.001065 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:08.001558 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.001090 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:08.001558 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.001102 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fgl76 for pod openshift-network-diagnostics/network-check-target-52dbn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:08.001558 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.001159 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76 podName:3dd548ef-63ff-4ea7-825d-0fa73a6487db nodeName:}" failed. No retries permitted until 2026-04-16 18:17:10.001141487 +0000 UTC m=+6.093429445 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fgl76" (UniqueName: "kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76") pod "network-check-target-52dbn" (UID: "3dd548ef-63ff-4ea7-825d-0fa73a6487db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:08.039760 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:08.039567 2566 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 18:17:08.403813 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:08.403736 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:08.404265 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.403899 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:08.404265 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.403958 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret podName:5504c48e-268b-4506-8112-5817466db907 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:09.403940447 +0000 UTC m=+5.496228393 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret") pod "global-pull-secret-syncer-wtxhb" (UID: "5504c48e-268b-4506-8112-5817466db907") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:08.449685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:08.448914 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:08.449685 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.449055 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:08.449685 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:08.449489 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:08.449685 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:08.449615 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:08.544421 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:08.544382 2566 generic.go:358] "Generic (PLEG): container finished" podID="e14ab5a37170f1f8894dc8ae352b1322" containerID="2684346a5d634c4a0d6c435d70be07cc1f4cc061ff6884bb56d04c78475f6b59" exitCode=0 Apr 16 18:17:08.545452 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:08.545422 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal" event={"ID":"e14ab5a37170f1f8894dc8ae352b1322","Type":"ContainerDied","Data":"2684346a5d634c4a0d6c435d70be07cc1f4cc061ff6884bb56d04c78475f6b59"} Apr 16 18:17:09.412671 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:09.412631 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:09.413195 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:09.412762 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:09.413195 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:09.412841 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret podName:5504c48e-268b-4506-8112-5817466db907 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:11.412821134 +0000 UTC m=+7.505109082 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret") pod "global-pull-secret-syncer-wtxhb" (UID: "5504c48e-268b-4506-8112-5817466db907") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:09.447954 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:09.447921 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:09.448141 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:09.448079 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:09.549635 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:09.549599 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal" event={"ID":"e14ab5a37170f1f8894dc8ae352b1322","Type":"ContainerStarted","Data":"880ef33df4dfae6311d02545e04fbcf6c3b0d68bd332c812323cb5d94d08ad5a"} Apr 16 18:17:09.917017 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:09.916960 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:09.917196 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:09.917134 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:09.917256 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:09.917219 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:13.917198919 +0000 UTC m=+10.009486864 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:10.018106 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:10.018010 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:10.018272 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:10.018120 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:10.018272 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:10.018141 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:10.018272 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:10.018154 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fgl76 for pod openshift-network-diagnostics/network-check-target-52dbn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:10.018272 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:10.018214 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76 podName:3dd548ef-63ff-4ea7-825d-0fa73a6487db nodeName:}" failed. No retries permitted until 2026-04-16 18:17:14.018195927 +0000 UTC m=+10.110483875 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fgl76" (UniqueName: "kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76") pod "network-check-target-52dbn" (UID: "3dd548ef-63ff-4ea7-825d-0fa73a6487db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:10.450444 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:10.449283 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:10.450444 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:10.449418 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:10.450444 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:10.449892 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:10.450444 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:10.449986 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:11.431329 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:11.430692 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:11.431329 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:11.430885 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:11.431329 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:11.430953 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret podName:5504c48e-268b-4506-8112-5817466db907 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:15.430933926 +0000 UTC m=+11.523221870 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret") pod "global-pull-secret-syncer-wtxhb" (UID: "5504c48e-268b-4506-8112-5817466db907") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:11.448341 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:11.448315 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:11.448496 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:11.448437 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:12.448478 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:12.448345 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:12.448478 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:12.448345 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:12.448936 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:12.448479 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:12.448936 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:12.448549 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:13.447922 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:13.447888 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:13.448104 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:13.448038 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:13.952793 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:13.952734 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:13.953209 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:13.952899 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:13.953209 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:13.952972 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:21.952951774 +0000 UTC m=+18.045239735 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:14.053967 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:14.053893 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:14.054154 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:14.054049 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:14.054154 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:14.054074 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:14.054154 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:14.054089 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fgl76 for pod openshift-network-diagnostics/network-check-target-52dbn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:14.054154 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:14.054152 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76 podName:3dd548ef-63ff-4ea7-825d-0fa73a6487db nodeName:}" failed. No retries permitted until 2026-04-16 18:17:22.054133665 +0000 UTC m=+18.146421628 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fgl76" (UniqueName: "kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76") pod "network-check-target-52dbn" (UID: "3dd548ef-63ff-4ea7-825d-0fa73a6487db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:14.449045 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:14.449006 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:14.449232 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:14.449140 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:14.449494 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:14.449475 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:14.449636 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:14.449593 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:15.448357 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:15.448320 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:15.448793 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:15.448452 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:15.465196 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:15.465161 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:15.465341 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:15.465319 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:15.465394 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:15.465383 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret podName:5504c48e-268b-4506-8112-5817466db907 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:23.465363598 +0000 UTC m=+19.557651547 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret") pod "global-pull-secret-syncer-wtxhb" (UID: "5504c48e-268b-4506-8112-5817466db907") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:16.447751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:16.447718 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:16.447951 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:16.447718 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:16.447951 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:16.447868 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:16.447951 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:16.447914 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:17.448187 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:17.448119 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:17.448564 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:17.448264 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:18.448313 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:18.448280 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:18.448731 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:18.448406 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:18.448731 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:18.448465 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:18.448731 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:18.448592 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:19.448558 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:19.448511 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:19.449040 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:19.448653 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:20.448163 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:20.448132 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:20.448345 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:20.448238 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:20.448407 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:20.448342 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:20.448511 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:20.448470 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:21.448594 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:21.448557 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:21.449168 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:21.448688 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:22.017740 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:22.017704 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:22.017897 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.017856 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:22.017941 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.017921 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.017906855 +0000 UTC m=+34.110194799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 18:17:22.118252 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:22.118214 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:22.118430 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.118368 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 18:17:22.118430 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.118381 2566 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 18:17:22.118430 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.118392 2566 projected.go:194] Error preparing data for projected volume kube-api-access-fgl76 for pod openshift-network-diagnostics/network-check-target-52dbn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:22.118566 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.118451 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76 podName:3dd548ef-63ff-4ea7-825d-0fa73a6487db nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.118434381 +0000 UTC m=+34.210722347 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-fgl76" (UniqueName: "kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76") pod "network-check-target-52dbn" (UID: "3dd548ef-63ff-4ea7-825d-0fa73a6487db") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 18:17:22.447977 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:22.447944 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:22.448153 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:22.448008 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:22.448153 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.448103 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:22.448268 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:22.448233 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:23.448616 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:23.448583 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:23.448975 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:23.448688 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:23.527236 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:23.527215 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:23.527327 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:23.527315 2566 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:23.527374 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:23.527362 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret podName:5504c48e-268b-4506-8112-5817466db907 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:39.527350411 +0000 UTC m=+35.619638354 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret") pod "global-pull-secret-syncer-wtxhb" (UID: "5504c48e-268b-4506-8112-5817466db907") : object "kube-system"/"original-pull-secret" not registered Apr 16 18:17:24.449094 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.448732 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:24.449907 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.448794 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:24.449907 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:24.449198 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:24.449907 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:24.449241 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:24.569398 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.569371 2566 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 18:17:24.575762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.575726 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"c2b4217ebd01f76befb30514e4bb94fd1b0c905c52768d3137b2915798b509d5"} Apr 16 18:17:24.575911 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.575770 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"dc6fa749618160a1ab0364c2ecc6c1ddd62f167c8775813deac6d1e2d5b5e572"} Apr 16 18:17:24.575911 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.575782 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"a58942787b4f0f683b5233bbbff734c349ef40a030deaf7eeb9e368f98af5c83"} Apr 16 18:17:24.575911 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.575790 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"70434774a7a12249f23070eb5a03c77c66ada3d2beb8f6a6b34bafe5e064f2fe"} Apr 16 18:17:24.575911 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.575802 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" 
event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"58c1f9a21679ebd7e9f56289df500b2d5a0d05fcfd1339dafc915b01f3cebb2c"} Apr 16 18:17:24.575911 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.575812 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"b2ce93d751082f19f5a51921e172c0c371041ba725918d5d1acf9a8dac563e74"} Apr 16 18:17:24.577005 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.576971 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-gs6xw" event={"ID":"25aae314-4a74-4705-b118-50fda5694b79","Type":"ContainerStarted","Data":"6b5d92bd43c30a47389e23bc7bad4dab855bfed22230ddaa3203bdfc2328cf43"} Apr 16 18:17:24.578261 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.578241 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-hp46j" event={"ID":"b06d5b5e-fa9c-4211-acc1-3b2c5f851673","Type":"ContainerStarted","Data":"9b8b0b58019f9fd3bad8b78287fb54d25c705a7924bf3ffac0fd6c42ba4e4ba4"} Apr 16 18:17:24.579423 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.579406 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" event={"ID":"aceee498-5925-409d-b85d-233e32fb5593","Type":"ContainerStarted","Data":"d5c73c2c13466aa2ff82a98aab4d9c1e9524f2b2ffb8fd3ef9cb63e6e2463102"} Apr 16 18:17:24.580608 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.580592 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-6tlbq" event={"ID":"1c68ba07-dba4-4de5-923b-da334bafc1fb","Type":"ContainerStarted","Data":"6328c285b499c84155ff2cda0101423ee79f88adb22042ba8b93ccc27a37d330"} Apr 16 18:17:24.582075 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.582058 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" event={"ID":"d1962cad-7d60-4a0c-9f80-27474c5ef678","Type":"ContainerStarted","Data":"0bc6eeb3effee20749f2e5d1fa03da42adec8604a6cde9def4425cf930faa195"} Apr 16 18:17:24.582153 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.582078 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" event={"ID":"d1962cad-7d60-4a0c-9f80-27474c5ef678","Type":"ContainerStarted","Data":"ca2194c28e297097ac8478ba8389d887fd49f88927ba1487f74ed28649450663"} Apr 16 18:17:24.583302 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.583283 2566 generic.go:358] "Generic (PLEG): container finished" podID="b9fbc6e2-6448-4213-ac02-c0df39de143e" containerID="4f07dc406e594bd9295c1f9a2e00b1fea9efc30602d4ef155a15f00c755bcaed" exitCode=0 Apr 16 18:17:24.583377 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.583308 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerDied","Data":"4f07dc406e594bd9295c1f9a2e00b1fea9efc30602d4ef155a15f00c755bcaed"} Apr 16 18:17:24.595039 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.595008 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-136-226.ec2.internal" podStartSLOduration=19.59498115 podStartE2EDuration="19.59498115s" podCreationTimestamp="2026-04-16 18:17:05 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:09.566101737 +0000 UTC m=+5.658389703" watchObservedRunningTime="2026-04-16 18:17:24.59498115 +0000 UTC m=+20.687269114" Apr 16 18:17:24.595163 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.595142 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-gs6xw" podStartSLOduration=4.006392421 podStartE2EDuration="20.595137096s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.95261627 +0000 UTC m=+3.044904224" lastFinishedPulling="2026-04-16 18:17:23.541360942 +0000 UTC m=+19.633648899" observedRunningTime="2026-04-16 18:17:24.594581178 +0000 UTC m=+20.686869143" watchObservedRunningTime="2026-04-16 18:17:24.595137096 +0000 UTC m=+20.687425060" Apr 16 18:17:24.607983 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.607949 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-6tlbq" podStartSLOduration=4.065559329 podStartE2EDuration="20.607938793s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.956082395 +0000 UTC m=+3.048370351" lastFinishedPulling="2026-04-16 18:17:23.498461861 +0000 UTC m=+19.590749815" observedRunningTime="2026-04-16 18:17:24.607651678 +0000 UTC m=+20.699939841" watchObservedRunningTime="2026-04-16 18:17:24.607938793 +0000 UTC m=+20.700226758" Apr 16 18:17:24.620504 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:24.620469 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-hp46j" podStartSLOduration=8.547652455 podStartE2EDuration="20.620459752s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.949124501 +0000 UTC m=+3.041412443" lastFinishedPulling="2026-04-16 18:17:19.021931794 +0000 UTC m=+15.114219740" observedRunningTime="2026-04-16 18:17:24.620240431 +0000 UTC m=+20.712528408" watchObservedRunningTime="2026-04-16 18:17:24.620459752 +0000 UTC m=+20.712747717" Apr 16 18:17:25.350426 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:25.350274 2566 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T18:17:24.569386578Z","UUID":"6ea77650-0df6-461b-b313-6387dad3d06a","Handler":null,"Name":"","Endpoint":""} Apr 16 18:17:25.352860 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:25.352367 2566 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 18:17:25.352860 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:25.352409 2566 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 18:17:25.448188 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:25.448149 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:25.448364 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:25.448268 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:25.587107 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:25.587070 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p2gt8" event={"ID":"a759571b-0c88-4edc-829c-d1cdc47b056f","Type":"ContainerStarted","Data":"70e376c36d9d9b1c1969099ccffe60e62d0bc9d201a23c30bb9f5bf4e6b01519"} Apr 16 18:17:25.606682 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:25.606638 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-p2gt8" podStartSLOduration=5.060655484 podStartE2EDuration="21.60662678s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.952589547 +0000 UTC m=+3.044877516" lastFinishedPulling="2026-04-16 18:17:23.498560864 +0000 UTC m=+19.590848812" observedRunningTime="2026-04-16 18:17:25.606483521 +0000 UTC m=+21.698771487" watchObservedRunningTime="2026-04-16 18:17:25.60662678 +0000 UTC m=+21.698914744" Apr 16 18:17:25.606984 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:25.606948 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-gs5pw" podStartSLOduration=5.025448701 podStartE2EDuration="21.606941091s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.95955886 +0000 UTC m=+3.051846817" lastFinishedPulling="2026-04-16 18:17:23.54105125 +0000 UTC m=+19.633339207" observedRunningTime="2026-04-16 18:17:24.654907838 +0000 UTC m=+20.747195803" watchObservedRunningTime="2026-04-16 18:17:25.606941091 +0000 UTC m=+21.699229058" Apr 16 18:17:26.447707 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:26.447677 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:26.447914 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:26.447677 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:26.447914 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:26.447798 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:26.447914 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:26.447869 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:26.591508 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:26.591468 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" event={"ID":"d1962cad-7d60-4a0c-9f80-27474c5ef678","Type":"ContainerStarted","Data":"44d5221af01c855140012039f9e11080bd61b3ea08f7af10d487f795c9ee830c"} Apr 16 18:17:26.594908 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:26.594879 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"a5862a14c35632512b6927a8671d1867a5ec386e6f39f32166b38371bb096914"} Apr 16 18:17:26.622118 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:26.622010 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-kztnj" podStartSLOduration=4.0730938309999996 podStartE2EDuration="22.621981571s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.957021162 +0000 UTC m=+3.049309120" lastFinishedPulling="2026-04-16 18:17:25.505908906 +0000 UTC m=+21.598196860" observedRunningTime="2026-04-16 18:17:26.621616371 +0000 UTC m=+22.713904336" watchObservedRunningTime="2026-04-16 18:17:26.621981571 +0000 UTC m=+22.714269537" Apr 16 18:17:27.053973 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.053899 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-5psgq"] Apr 16 18:17:27.060385 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.060362 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.063191 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.063144 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 18:17:27.063191 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.063190 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 18:17:27.063371 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.063255 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-824f6\"" Apr 16 18:17:27.154366 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.154337 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/125ba5ab-da90-4b8d-b93b-56e647e63aff-hosts-file\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.154525 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.154475 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/125ba5ab-da90-4b8d-b93b-56e647e63aff-tmp-dir\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.154525 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.154508 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmq5\" (UniqueName: 
\"kubernetes.io/projected/125ba5ab-da90-4b8d-b93b-56e647e63aff-kube-api-access-vmmq5\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.255409 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.255373 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/125ba5ab-da90-4b8d-b93b-56e647e63aff-tmp-dir\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.255574 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.255415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmq5\" (UniqueName: \"kubernetes.io/projected/125ba5ab-da90-4b8d-b93b-56e647e63aff-kube-api-access-vmmq5\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.255574 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.255451 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/125ba5ab-da90-4b8d-b93b-56e647e63aff-hosts-file\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.255574 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.255531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/125ba5ab-da90-4b8d-b93b-56e647e63aff-hosts-file\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.255724 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.255687 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/125ba5ab-da90-4b8d-b93b-56e647e63aff-tmp-dir\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.269542 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.269510 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmq5\" (UniqueName: \"kubernetes.io/projected/125ba5ab-da90-4b8d-b93b-56e647e63aff-kube-api-access-vmmq5\") pod \"node-resolver-5psgq\" (UID: \"125ba5ab-da90-4b8d-b93b-56e647e63aff\") " pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.370128 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.370089 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5psgq" Apr 16 18:17:27.448057 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:27.448035 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:27.448210 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:27.448128 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:28.448370 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.448203 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:28.448716 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.448273 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:28.448716 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:28.448432 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:28.448716 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:28.448519 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:28.464518 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:28.464490 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod125ba5ab_da90_4b8d_b93b_56e647e63aff.slice/crio-22bf20dd7c6810619090f1fb49c888d57aa2298eccb4db1eb91c158854e63e1d WatchSource:0}: Error finding container 22bf20dd7c6810619090f1fb49c888d57aa2298eccb4db1eb91c158854e63e1d: Status 404 returned error can't find the container with id 22bf20dd7c6810619090f1fb49c888d57aa2298eccb4db1eb91c158854e63e1d Apr 16 18:17:28.603130 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.603023 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" event={"ID":"dca169f9-fe56-4084-aff9-5a447ae82401","Type":"ContainerStarted","Data":"8abcb08ce1c6ed0be30bd9ec1b31db7edd5a06dd869d2e2fd97c8ffb0d3a467a"} Apr 16 18:17:28.603506 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.603340 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:28.603506 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.603363 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:28.604880 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.604611 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5psgq" event={"ID":"125ba5ab-da90-4b8d-b93b-56e647e63aff","Type":"ContainerStarted","Data":"503b4632de0403c1b356c1e6eef3863972a277dcd51c4ec75b1dfd0906b0b912"} Apr 16 18:17:28.604880 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.604636 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5psgq" event={"ID":"125ba5ab-da90-4b8d-b93b-56e647e63aff","Type":"ContainerStarted","Data":"22bf20dd7c6810619090f1fb49c888d57aa2298eccb4db1eb91c158854e63e1d"} Apr 16 18:17:28.617899 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.617830 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:28.632561 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:28.632520 2566 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" podStartSLOduration=7.828504423 podStartE2EDuration="24.63250896s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.949951637 +0000 UTC m=+3.042239583" lastFinishedPulling="2026-04-16 18:17:23.753956163 +0000 UTC m=+19.846244120" observedRunningTime="2026-04-16 18:17:28.631933859 +0000 UTC m=+24.724221823" watchObservedRunningTime="2026-04-16 18:17:28.63250896 +0000 UTC m=+24.724796925" Apr 16 18:17:29.448671 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.448489 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:29.449421 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:29.448748 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:29.472134 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.472108 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:29.472658 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.472638 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:29.490194 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.490161 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5psgq" podStartSLOduration=2.490148486 podStartE2EDuration="2.490148486s" podCreationTimestamp="2026-04-16 18:17:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:17:28.694226119 +0000 UTC m=+24.786514085" watchObservedRunningTime="2026-04-16 18:17:29.490148486 +0000 UTC m=+25.582436450" Apr 16 18:17:29.607445 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.607407 2566 generic.go:358] "Generic (PLEG): container finished" podID="b9fbc6e2-6448-4213-ac02-c0df39de143e" containerID="f7bed4bc0f2e761d33694234ced560978d43e6c88029e8aae492d1b5847792e3" exitCode=0 Apr 16 18:17:29.607612 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.607465 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerDied","Data":"f7bed4bc0f2e761d33694234ced560978d43e6c88029e8aae492d1b5847792e3"} Apr 16 18:17:29.607722 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.607700 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:29.608684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.608191 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:29.608684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.608227 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-6tlbq" Apr 16 18:17:29.622326 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:29.622309 2566 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:17:30.448648 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.448572 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:30.448787 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.448572 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:30.448787 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:30.448715 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:30.449234 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:30.448784 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:30.508713 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.508687 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-52dbn"] Apr 16 18:17:30.514948 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.514921 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wtxhb"] Apr 16 18:17:30.515082 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.515044 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:30.515153 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:30.515134 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:30.528243 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.528213 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jj9db"] Apr 16 18:17:30.610564 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.610532 2566 generic.go:358] "Generic (PLEG): container finished" podID="b9fbc6e2-6448-4213-ac02-c0df39de143e" containerID="cf7f56d852a86666603cc2a0f9d4ad1ca1ff189cb92d845c17c93779ce32c55e" exitCode=0 Apr 16 18:17:30.610728 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.610637 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerDied","Data":"cf7f56d852a86666603cc2a0f9d4ad1ca1ff189cb92d845c17c93779ce32c55e"} Apr 16 18:17:30.610728 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.610665 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:30.611138 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:30.611092 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:30.611138 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:30.611121 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:30.611309 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:30.611220 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:31.613626 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:31.613598 2566 generic.go:358] "Generic (PLEG): container finished" podID="b9fbc6e2-6448-4213-ac02-c0df39de143e" containerID="66a63719049cd4ca510b2d02b22ba17f03f5eadef94d3d9671f8fdc845d56e0f" exitCode=0 Apr 16 18:17:31.613973 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:31.613679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerDied","Data":"66a63719049cd4ca510b2d02b22ba17f03f5eadef94d3d9671f8fdc845d56e0f"} Apr 16 18:17:32.447952 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:32.447910 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:32.448139 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:32.448052 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:32.448139 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:32.448062 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:32.448248 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:32.448176 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:32.448248 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:32.448229 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:32.448339 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:32.448297 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:34.449406 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:34.449363 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:34.449924 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:34.449493 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:34.449924 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:34.449509 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-52dbn" podUID="3dd548ef-63ff-4ea7-825d-0fa73a6487db" Apr 16 18:17:34.449924 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:34.449570 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-wtxhb" podUID="5504c48e-268b-4506-8112-5817466db907" Apr 16 18:17:34.449924 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:34.449586 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:34.449924 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:34.449715 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:17:36.264965 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.264758 2566 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-136-226.ec2.internal" event="NodeReady" Apr 16 18:17:36.265400 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.265120 2566 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 18:17:36.305736 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.305695 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9bcb99cb4-2lzkn"] Apr 16 18:17:36.340876 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.340847 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm"] Apr 16 18:17:36.341064 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.341016 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.343576 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.343547 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 18:17:36.343709 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.343651 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-59g6h\"" Apr 16 18:17:36.343799 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.343780 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 18:17:36.343868 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.343835 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 18:17:36.348709 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.348685 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 18:17:36.365184 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.365162 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2zc2n"] Apr 16 18:17:36.365337 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.365318 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:36.367820 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.367801 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 16 18:17:36.367947 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.367845 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 16 18:17:36.367947 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.367801 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-nk7cg\"" Apr 16 18:17:36.389351 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.389290 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cpbgb"] Apr 16 18:17:36.389479 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.389459 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.392090 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.392069 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 18:17:36.392090 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.392085 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m4v2s\"" Apr 16 18:17:36.392251 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.392119 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 18:17:36.415160 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.415132 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm"] Apr 16 18:17:36.415160 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.415161 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9bcb99cb4-2lzkn"] Apr 16 18:17:36.415306 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.415174 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cpbgb"] Apr 16 18:17:36.415306 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.415188 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2zc2n"] Apr 16 18:17:36.415306 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.415298 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:36.417913 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.417894 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 18:17:36.418039 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.417933 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 18:17:36.418205 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.418190 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v2cgj\"" Apr 16 18:17:36.418293 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.418248 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 18:17:36.430672 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.430639 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19763774-6747-497c-913a-f8852a4e5a0d-ca-trust-extracted\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.430858 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.430704 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-bound-sa-token\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.430858 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.430749 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-registry-certificates\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.430858 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.430774 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-installation-pull-secrets\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.430858 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.430813 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-image-registry-private-configuration\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.431111 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.430861 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-trusted-ca\") pod 
\"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.431111 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.430909 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vlp\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-kube-api-access-c8vlp\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.431111 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.431024 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.448242 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.448218 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:36.448347 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.448218 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:36.448898 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.448224 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:36.451081 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.451065 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 18:17:36.451174 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.451117 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-v57pk\"" Apr 16 18:17:36.451240 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.451177 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8x9mw\"" Apr 16 18:17:36.451240 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.451118 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 18:17:36.451384 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.451364 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 18:17:36.451429 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.451411 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 18:17:36.532126 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532087 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19763774-6747-497c-913a-f8852a4e5a0d-ca-trust-extracted\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532287 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532138 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-bound-sa-token\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532287 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532169 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:36.532287 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532196 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-registry-certificates\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532287 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532220 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-installation-pull-secrets\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532287 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df9zl\" (UniqueName: \"kubernetes.io/projected/6b838b63-87c9-46b4-96a1-ed246b230c36-kube-api-access-df9zl\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:36.532544 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532400 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-image-registry-private-configuration\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532544 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532466 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-trusted-ca\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532544 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532498 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:36.532544 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532539 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42w55\" (UniqueName: \"kubernetes.io/projected/4a945a62-4bc4-4f09-8555-50569018d9ac-kube-api-access-42w55\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.532734 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532583 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8vlp\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-kube-api-access-c8vlp\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532734 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532597 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19763774-6747-497c-913a-f8852a4e5a0d-ca-trust-extracted\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.532734 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532617 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7de8316-4440-4039-ae31-310a6c1146a9-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:36.532734 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532642 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.532734 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532670 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4a945a62-4bc4-4f09-8555-50569018d9ac-tmp-dir\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.532734 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532708 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a945a62-4bc4-4f09-8555-50569018d9ac-config-volume\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.533057 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.532743 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.533057 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.532877 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:17:36.533057 ip-10-0-136-226 kubenswrapper[2566]: I0416 
18:17:36.532879 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-registry-certificates\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.533057 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.532893 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:17:36.533057 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.532966 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:37.032943805 +0000 UTC m=+33.125231750 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:17:36.533434 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.533408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-trusted-ca\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.536783 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.536762 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-installation-pull-secrets\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.536892 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.536760 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-image-registry-private-configuration\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.545390 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.545363 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-bound-sa-token\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.545508 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.545418 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8vlp\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-kube-api-access-c8vlp\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:36.633958 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.633929 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42w55\" (UniqueName: \"kubernetes.io/projected/4a945a62-4bc4-4f09-8555-50569018d9ac-kube-api-access-42w55\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.634133 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.633974 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7de8316-4440-4039-ae31-310a6c1146a9-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:36.634133 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634006 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.634133 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634032 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4a945a62-4bc4-4f09-8555-50569018d9ac-tmp-dir\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.634133 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634063 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a945a62-4bc4-4f09-8555-50569018d9ac-config-volume\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.634396 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.634142 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:36.634396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634160 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:36.634396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634199 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-df9zl\" (UniqueName: \"kubernetes.io/projected/6b838b63-87c9-46b4-96a1-ed246b230c36-kube-api-access-df9zl\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:36.634396 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.634225 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:17:37.13420237 +0000 UTC m=+33.226490315 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:17:36.634396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634266 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:36.634396 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.634339 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:36.634396 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.634364 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:37.134356499 +0000 UTC m=+33.226644442 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:17:36.634725 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.634507 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:17:36.634725 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:36.634578 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:37.134553573 +0000 UTC m=+33.226841516 (durationBeforeRetry 500ms). 
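Every MountVolume.SetUp failure in this stretch has the same shape: the pod references a Secret that its operator has not published yet (image-registry-tls, dns-default-metrics-tls, canary-serving-cert, networking-console-plugin-cert, and metrics-daemon-secret below), so the secret-backed volume cannot be built and nestedpendingoperations schedules a retry. As a rough way to watch for those Secrets appearing, a minimal client-go sketch — the namespace/name pairs are copied from the entries above; the program itself is illustrative only, not part of kubelet:

package main

import (
	"context"
	"fmt"
	"os"
	"path/filepath"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumes an admin kubeconfig; adjust the path for your environment.
	cfg, err := clientcmd.BuildConfigFromFlags("", filepath.Join(os.Getenv("HOME"), ".kube", "config"))
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Namespace -> Secret name, copied from the kubelet errors in this log.
	missing := map[string]string{
		"openshift-image-registry":  "image-registry-tls",
		"openshift-dns":             "dns-default-metrics-tls",
		"openshift-ingress-canary":  "canary-serving-cert",
		"openshift-network-console": "networking-console-plugin-cert",
		"openshift-multus":          "metrics-daemon-secret",
	}
	for ns, name := range missing {
		_, err := client.CoreV1().Secrets(ns).Get(context.TODO(), name, metav1.GetOptions{})
		if err != nil {
			fmt.Printf("%s/%s: %v\n", ns, name, err)
			continue
		}
		fmt.Printf("%s/%s: present\n", ns, name)
	}
}

The log resumes mid-entry with the error detail for the networking-console-plugin-cert operation: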
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:17:36.634725 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634686 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/4a945a62-4bc4-4f09-8555-50569018d9ac-tmp-dir\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.634881 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.634753 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7de8316-4440-4039-ae31-310a6c1146a9-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:36.643336 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.643283 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42w55\" (UniqueName: \"kubernetes.io/projected/4a945a62-4bc4-4f09-8555-50569018d9ac-kube-api-access-42w55\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:36.643472 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.643451 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-df9zl\" (UniqueName: \"kubernetes.io/projected/6b838b63-87c9-46b4-96a1-ed246b230c36-kube-api-access-df9zl\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:36.645341 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:36.645321 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a945a62-4bc4-4f09-8555-50569018d9ac-config-volume\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:37.037394 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:37.037310 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:37.037548 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.037484 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:17:37.037548 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.037508 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:17:37.037661 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.037588 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. 
No retries permitted until 2026-04-16 18:17:38.037566443 +0000 UTC m=+34.129854388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:17:37.138335 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:37.138298 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:37.138335 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:37.138343 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:37.138526 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.138447 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:37.138526 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.138504 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.138489899 +0000 UTC m=+34.230777846 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:17:37.138610 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:37.138535 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:37.138610 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.138558 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:37.138610 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.138601 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.138590199 +0000 UTC m=+34.230878142 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:17:37.138713 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.138612 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:17:37.138713 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:37.138634 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:38.138627505 +0000 UTC m=+34.230915448 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:17:38.045518 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.045437 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:17:38.046038 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.045547 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:38.046038 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.045599 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:17:38.046038 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.045648 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:17:38.046038 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.045660 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:17:38.046038 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.045671 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:10.045648795 +0000 UTC m=+66.137936740 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : secret "metrics-daemon-secret" not found Apr 16 18:17:38.046038 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.045692 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.045681506 +0000 UTC m=+36.137969454 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:17:38.146661 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.146627 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:38.146802 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.146678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:38.146802 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.146750 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:38.146802 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.146778 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:38.146802 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.146782 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:38.146973 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.146851 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:17:38.146973 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.146851 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:38.146973 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.146858 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. 
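Note the durationBeforeRetry progression: registry-tls goes 500ms, then 1s, then 2s within this excerpt, while metrics-certs (which had already been failing before 18:17:36) jumps straight to 32s. Kubelet doubles the delay on each consecutive failure of the same volume operation and tracks that state per volume, which is why different volumes sit at different points on the schedule. A sketch of the doubling; the 2m2s ceiling is an assumption based on kubelet's exponential-backoff defaults, not something visible in this excerpt:

package main

import (
	"fmt"
	"time"
)

// nextDelay mirrors the doubling visible in the log: each consecutive
// failure of the same volume operation doubles the previous delay.
func nextDelay(prev time.Duration) time.Duration {
	const (
		initial = 500 * time.Millisecond
		ceiling = 2*time.Minute + 2*time.Second // assumed cap, not shown in this log
	)
	if prev == 0 {
		return initial
	}
	if d := 2 * prev; d < ceiling {
		return d
	}
	return ceiling
}

func main() {
	var d time.Duration
	for i := 0; i < 9; i++ {
		d = nextDelay(d)
		fmt.Println(d) // 500ms, 1s, 2s, 4s, 8s, 16s, 32s, 1m4s, 2m2s
	}
}

The interrupted entry continues with the retry schedule for the canary cert: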
No retries permitted until 2026-04-16 18:17:40.146839335 +0000 UTC m=+36.239127296 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:17:38.146973 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.146922 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.146908216 +0000 UTC m=+36.239196166 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:17:38.146973 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:38.146942 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:17:40.146933859 +0000 UTC m=+36.239221804 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:17:38.149224 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.149204 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgl76\" (UniqueName: \"kubernetes.io/projected/3dd548ef-63ff-4ea7-825d-0fa73a6487db-kube-api-access-fgl76\") pod \"network-check-target-52dbn\" (UID: \"3dd548ef-63ff-4ea7-825d-0fa73a6487db\") " pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:38.259368 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.259334 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:38.426245 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.426212 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-52dbn"] Apr 16 18:17:38.430803 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:38.430782 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dd548ef_63ff_4ea7_825d_0fa73a6487db.slice/crio-1140ca545fa4f8650591560cb2a357aaba28260da57462cca91f48a66da84953 WatchSource:0}: Error finding container 1140ca545fa4f8650591560cb2a357aaba28260da57462cca91f48a66da84953: Status 404 returned error can't find the container with id 1140ca545fa4f8650591560cb2a357aaba28260da57462cca91f48a66da84953 Apr 16 18:17:38.628987 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.628753 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-52dbn" event={"ID":"3dd548ef-63ff-4ea7-825d-0fa73a6487db","Type":"ContainerStarted","Data":"1140ca545fa4f8650591560cb2a357aaba28260da57462cca91f48a66da84953"} Apr 16 18:17:38.631033 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.631011 2566 generic.go:358] "Generic (PLEG): container finished" podID="b9fbc6e2-6448-4213-ac02-c0df39de143e" containerID="58322dc06c941e71c9ccbf030e8f2c327f9e4ae85ba78a958624806e31c160bf" exitCode=0 Apr 16 18:17:38.631133 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:38.631056 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerDied","Data":"58322dc06c941e71c9ccbf030e8f2c327f9e4ae85ba78a958624806e31c160bf"} Apr 16 18:17:39.557287 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:39.557246 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:39.560975 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:39.560946 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5504c48e-268b-4506-8112-5817466db907-original-pull-secret\") pod \"global-pull-secret-syncer-wtxhb\" (UID: \"5504c48e-268b-4506-8112-5817466db907\") " pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:39.636788 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:39.636736 2566 generic.go:358] "Generic (PLEG): container finished" podID="b9fbc6e2-6448-4213-ac02-c0df39de143e" containerID="db38996d600403f521e986b25dbb7a785a70a19f76eddd435402b1214aa0562b" exitCode=0 Apr 16 18:17:39.636788 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:39.636777 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerDied","Data":"db38996d600403f521e986b25dbb7a785a70a19f76eddd435402b1214aa0562b"} Apr 16 18:17:39.766940 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:39.766914 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-wtxhb" Apr 16 18:17:39.895969 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:39.895936 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-wtxhb"] Apr 16 18:17:39.900497 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:17:39.900469 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5504c48e_268b_4506_8112_5817466db907.slice/crio-460722b26427874d01fe549984dbd42478f0fcf134632e4572eee5bf64b32efb WatchSource:0}: Error finding container 460722b26427874d01fe549984dbd42478f0fcf134632e4572eee5bf64b32efb: Status 404 returned error can't find the container with id 460722b26427874d01fe549984dbd42478f0fcf134632e4572eee5bf64b32efb Apr 16 18:17:40.062389 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:40.062356 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:40.062544 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.062527 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:17:40.062609 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.062550 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:17:40.062609 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.062607 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:44.062592118 +0000 UTC m=+40.154880062 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:17:40.163408 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:40.163365 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:40.163564 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:40.163469 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:40.163564 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:40.163497 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:40.163564 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.163535 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:40.163699 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.163610 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:40.163699 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.163625 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:17:44.163589906 +0000 UTC m=+40.255877855 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:17:40.163699 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.163667 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:44.163655024 +0000 UTC m=+40.255942968 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:17:40.163699 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.163610 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:17:40.163898 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:40.163706 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:44.16369569 +0000 UTC m=+40.255983639 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:17:40.642933 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:40.642887 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" event={"ID":"b9fbc6e2-6448-4213-ac02-c0df39de143e","Type":"ContainerStarted","Data":"64b47c8055bf6e9da9917192ad7f31e423af396d8fdedd723fc794a4c3a7e603"} Apr 16 18:17:40.644078 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:40.644047 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wtxhb" event={"ID":"5504c48e-268b-4506-8112-5817466db907","Type":"ContainerStarted","Data":"460722b26427874d01fe549984dbd42478f0fcf134632e4572eee5bf64b32efb"} Apr 16 18:17:40.669767 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:40.669491 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fl2g2" podStartSLOduration=6.150918952 podStartE2EDuration="36.669468668s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:06.955045808 +0000 UTC m=+3.047333766" lastFinishedPulling="2026-04-16 18:17:37.473595539 +0000 UTC m=+33.565883482" observedRunningTime="2026-04-16 18:17:40.667137322 +0000 UTC m=+36.759425287" watchObservedRunningTime="2026-04-16 18:17:40.669468668 +0000 UTC m=+36.761756634" Apr 16 18:17:42.649598 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:42.649566 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-52dbn" event={"ID":"3dd548ef-63ff-4ea7-825d-0fa73a6487db","Type":"ContainerStarted","Data":"9e62e6b422d64ce28d903b28ff048b90d03af7f1174da64feeb554816064c814"} Apr 16 18:17:42.650018 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:42.649789 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:17:42.667094 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:42.667040 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-52dbn" podStartSLOduration=34.877565939 podStartE2EDuration="38.667023594s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:17:38.432599683 +0000 UTC m=+34.524887639" 
lastFinishedPulling="2026-04-16 18:17:42.222057351 +0000 UTC m=+38.314345294" observedRunningTime="2026-04-16 18:17:42.666523638 +0000 UTC m=+38.758811606" watchObservedRunningTime="2026-04-16 18:17:42.667023594 +0000 UTC m=+38.759311559" Apr 16 18:17:44.096669 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:44.096631 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:44.097043 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.096783 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:17:44.097043 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.096803 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:17:44.097043 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.096852 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.09683719 +0000 UTC m=+48.189125134 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:17:44.197885 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:44.197853 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:44.197885 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:44.197892 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:44.198079 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:44.197916 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:44.198079 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.198028 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:44.198079 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.198034 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:17:44.198079 ip-10-0-136-226 kubenswrapper[2566]: E0416 
18:17:44.198071 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.198058845 +0000 UTC m=+48.290346793 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:17:44.198221 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.198122 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.198082666 +0000 UTC m=+48.290370610 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:17:44.198221 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.198034 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:44.198221 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:44.198159 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:17:52.19814994 +0000 UTC m=+48.290437883 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:17:44.655020 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:44.654915 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-wtxhb" event={"ID":"5504c48e-268b-4506-8112-5817466db907","Type":"ContainerStarted","Data":"d7295f44c6b83dc42183b6961e6f4355ea773e79c1c7ebc14b85f3cfbf919555"} Apr 16 18:17:44.670573 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:44.670528 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-wtxhb" podStartSLOduration=33.225211435 podStartE2EDuration="37.670514416s" podCreationTimestamp="2026-04-16 18:17:07 +0000 UTC" firstStartedPulling="2026-04-16 18:17:39.90287182 +0000 UTC m=+35.995159763" lastFinishedPulling="2026-04-16 18:17:44.348174787 +0000 UTC m=+40.440462744" observedRunningTime="2026-04-16 18:17:44.669967675 +0000 UTC m=+40.762255640" watchObservedRunningTime="2026-04-16 18:17:44.670514416 +0000 UTC m=+40.762802429" Apr 16 18:17:52.151852 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:52.151808 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:17:52.152292 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.151921 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:17:52.152292 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.151932 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:17:52.152292 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.151979 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:18:08.151965705 +0000 UTC m=+64.244253647 (durationBeforeRetry 16s). 
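The "Observed pod startup duration" entries above decompose cleanly: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration is that figure minus the image-pull window (lastFinishedPulling minus firstStartedPulling). Re-deriving the global-pull-secret-syncer-wtxhb numbers under that reading — the formula is inferred from the values in the log, not quoted from kubelet source, and wall-clock arithmetic lands within tens of nanoseconds of the logged values because kubelet subtracts monotonic readings:

package main

import (
	"fmt"
	"time"
)

func main() {
	parse := func(s string) time.Time {
		// Layout matches the timestamps printed in the log entries above.
		t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
		if err != nil {
			panic(err)
		}
		return t
	}
	created := parse("2026-04-16 18:17:07 +0000 UTC")
	firstPull := parse("2026-04-16 18:17:39.90287182 +0000 UTC")
	lastPull := parse("2026-04-16 18:17:44.348174787 +0000 UTC")
	running := parse("2026-04-16 18:17:44.670514416 +0000 UTC") // watchObservedRunningTime

	e2e := running.Sub(created)          // 37.670514416s = podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // ≈ 33.225211435s = podStartSLOduration
	fmt.Println(e2e, slo)
}

The interrupted entry continues with the 16s retry for the canary cert: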
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:17:52.252530 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:52.252495 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:17:52.252718 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:52.252541 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:17:52.252718 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:17:52.252596 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:17:52.252718 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.252673 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:17:52.252718 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.252695 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:17:52.252898 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.252678 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:17:52.252898 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.252760 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:08.252739934 +0000 UTC m=+64.345027878 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:17:52.252898 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.252782 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:18:08.25277232 +0000 UTC m=+64.345060266 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:17:52.252898 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:17:52.252803 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:08.252794154 +0000 UTC m=+64.345082105 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:18:01.626561 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:01.626531 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kxwr5" Apr 16 18:18:08.163237 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:08.163195 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:18:08.163654 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.163313 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:08.163654 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.163326 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:18:08.163654 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.163379 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.163362278 +0000 UTC m=+96.255650221 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:18:08.264317 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:08.264281 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:18:08.264317 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:08.264321 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:18:08.264513 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:08.264348 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:18:08.264513 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.264430 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:08.264513 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.264432 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:18:08.264513 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.264430 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:08.264513 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.264479 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.264465843 +0000 UTC m=+96.356753786 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:18:08.264513 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.264492 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.264485817 +0000 UTC m=+96.356773760 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:18:08.264513 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:08.264506 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:18:40.264498304 +0000 UTC m=+96.356786254 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:18:10.077151 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:10.077114 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:18:10.077502 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:10.077226 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:18:10.077502 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:10.077277 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:14.077262838 +0000 UTC m=+130.169550781 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : secret "metrics-daemon-secret" not found Apr 16 18:18:13.654124 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:13.654096 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-52dbn" Apr 16 18:18:40.194508 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:40.194468 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:18:40.194912 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.194612 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:18:40.194912 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.194632 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:18:40.194912 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.194703 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:19:44.194685464 +0000 UTC m=+160.286973411 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:18:40.295412 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:40.295325 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:18:40.295412 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:40.295395 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:18:40.295612 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:18:40.295416 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:18:40.295612 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.295477 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:18:40.295612 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.295502 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:18:40.295612 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.295522 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:18:40.295612 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.295551 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:19:44.295534515 +0000 UTC m=+160.387822457 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:18:40.295612 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.295574 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:44.295559836 +0000 UTC m=+160.387847778 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:18:40.295612 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:18:40.295587 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:19:44.295580793 +0000 UTC m=+160.387868735 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:19:14.146682 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:14.146638 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:19:14.147189 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:14.146797 2566 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 18:19:14.147189 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:14.146872 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs podName:bd001d43-c6f4-44f4-906e-c01f02068004 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:16.146856328 +0000 UTC m=+252.239144276 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs") pod "network-metrics-daemon-jj9db" (UID: "bd001d43-c6f4-44f4-906e-c01f02068004") : secret "metrics-daemon-secret" not found Apr 16 18:19:39.353438 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:39.353380 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" podUID="19763774-6747-497c-913a-f8852a4e5a0d" Apr 16 18:19:39.375047 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:39.375021 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" podUID="c7de8316-4440-4039-ae31-310a6c1146a9" Apr 16 18:19:39.398608 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:39.398583 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2zc2n" podUID="4a945a62-4bc4-4f09-8555-50569018d9ac" Apr 16 18:19:39.424246 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:39.424222 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cpbgb" podUID="6b838b63-87c9-46b4-96a1-ed246b230c36" Apr 16 18:19:39.472078 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:39.472050 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-jj9db" podUID="bd001d43-c6f4-44f4-906e-c01f02068004" Apr 16 18:19:39.867849 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:39.867824 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:19:39.868011 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:39.867857 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2zc2n" Apr 16 18:19:39.868011 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:39.867964 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:19:44.256711 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:44.256676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") pod \"image-registry-9bcb99cb4-2lzkn\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:19:44.257134 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.256776 2566 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 18:19:44.257134 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.256787 2566 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-9bcb99cb4-2lzkn: secret "image-registry-tls" not found Apr 16 18:19:44.257134 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.256828 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls podName:19763774-6747-497c-913a-f8852a4e5a0d nodeName:}" failed. No retries permitted until 2026-04-16 18:21:46.256815313 +0000 UTC m=+282.349103257 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls") pod "image-registry-9bcb99cb4-2lzkn" (UID: "19763774-6747-497c-913a-f8852a4e5a0d") : secret "image-registry-tls" not found Apr 16 18:19:44.357711 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:44.357677 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:19:44.357876 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:44.357720 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:19:44.357876 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:44.357764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:19:44.357876 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.357806 2566 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 18:19:44.357876 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.357871 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert podName:c7de8316-4440-4039-ae31-310a6c1146a9 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:46.357854223 +0000 UTC m=+282.450142166 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert") pod "networking-console-plugin-5cb6cf4cb4-r2ztm" (UID: "c7de8316-4440-4039-ae31-310a6c1146a9") : secret "networking-console-plugin-cert" not found Apr 16 18:19:44.358101 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.357882 2566 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 18:19:44.358101 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.357921 2566 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 18:19:44.358101 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.357939 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert podName:6b838b63-87c9-46b4-96a1-ed246b230c36 nodeName:}" failed. No retries permitted until 2026-04-16 18:21:46.357922876 +0000 UTC m=+282.450210841 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert") pod "ingress-canary-cpbgb" (UID: "6b838b63-87c9-46b4-96a1-ed246b230c36") : secret "canary-serving-cert" not found Apr 16 18:19:44.358101 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:19:44.357967 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls podName:4a945a62-4bc4-4f09-8555-50569018d9ac nodeName:}" failed. No retries permitted until 2026-04-16 18:21:46.357956494 +0000 UTC m=+282.450244438 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls") pod "dns-default-2zc2n" (UID: "4a945a62-4bc4-4f09-8555-50569018d9ac") : secret "dns-default-metrics-tls" not found Apr 16 18:19:50.447934 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:50.447890 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:19:52.447888 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:52.447850 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:19:53.012881 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:53.012853 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5psgq_125ba5ab-da90-4b8d-b93b-56e647e63aff/dns-node-resolver/0.log" Apr 16 18:19:54.013460 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:19:54.013393 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hp46j_b06d5b5e-fa9c-4211-acc1-3b2c5f851673/node-ca/0.log" Apr 16 18:20:15.254284 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.254245 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dz7b9"] Apr 16 18:20:15.257423 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.257401 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.265709 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.265689 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 18:20:15.265811 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.265692 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 18:20:15.266566 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.266549 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 18:20:15.266667 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.266582 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hm4gx\"" Apr 16 18:20:15.266809 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.266796 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 18:20:15.278414 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.278388 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dz7b9"] Apr 16 18:20:15.378200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.378170 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.378365 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.378229 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-crio-socket\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.378365 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.378335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-data-volume\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.378365 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.378358 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.378505 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.378396 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz2pv\" (UniqueName: \"kubernetes.io/projected/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-kube-api-access-bz2pv\") pod \"insights-runtime-extractor-dz7b9\" (UID: 
\"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.479473 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.479435 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bz2pv\" (UniqueName: \"kubernetes.io/projected/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-kube-api-access-bz2pv\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.479473 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.479482 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.479714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.479614 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-crio-socket\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.479714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.479710 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-data-volume\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.479820 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.479732 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.479820 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.479729 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-crio-socket\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.480033 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.480018 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-data-volume\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.480235 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.480219 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.482374 ip-10-0-136-226 
kubenswrapper[2566]: I0416 18:20:15.482360 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.488927 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.488908 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz2pv\" (UniqueName: \"kubernetes.io/projected/2021cc59-cdbb-4dd8-a90e-f8f2f331b558-kube-api-access-bz2pv\") pod \"insights-runtime-extractor-dz7b9\" (UID: \"2021cc59-cdbb-4dd8-a90e-f8f2f331b558\") " pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.565598 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.565531 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dz7b9" Apr 16 18:20:15.679460 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.679434 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dz7b9"] Apr 16 18:20:15.682867 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:15.682841 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2021cc59_cdbb_4dd8_a90e_f8f2f331b558.slice/crio-aab10598c03f0519bf7191dad5434e86eafd8c0b187f9db849ac062e8421a23d WatchSource:0}: Error finding container aab10598c03f0519bf7191dad5434e86eafd8c0b187f9db849ac062e8421a23d: Status 404 returned error can't find the container with id aab10598c03f0519bf7191dad5434e86eafd8c0b187f9db849ac062e8421a23d Apr 16 18:20:15.931953 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.931913 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dz7b9" event={"ID":"2021cc59-cdbb-4dd8-a90e-f8f2f331b558","Type":"ContainerStarted","Data":"137e205327af9d79538bb12d7c7c9db8c00945ee5e9ba008e2049d1d2d2ebd05"} Apr 16 18:20:15.931953 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:15.931952 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dz7b9" event={"ID":"2021cc59-cdbb-4dd8-a90e-f8f2f331b558","Type":"ContainerStarted","Data":"aab10598c03f0519bf7191dad5434e86eafd8c0b187f9db849ac062e8421a23d"} Apr 16 18:20:16.935740 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:16.935703 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dz7b9" event={"ID":"2021cc59-cdbb-4dd8-a90e-f8f2f331b558","Type":"ContainerStarted","Data":"d38b9624077d63433cf344426bc2185c25c59b22d79a33fbee41a967bed8b749"} Apr 16 18:20:17.939057 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:17.939020 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dz7b9" event={"ID":"2021cc59-cdbb-4dd8-a90e-f8f2f331b558","Type":"ContainerStarted","Data":"2f8a8ec7e0cc68e715371f1ae0d7b8f98ea2ee355e7bfae1dbe62882ea19f65d"} Apr 16 18:20:17.961078 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:17.961034 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dz7b9" podStartSLOduration=0.894173378 podStartE2EDuration="2.961022806s" podCreationTimestamp="2026-04-16 18:20:15 +0000 UTC" 
firstStartedPulling="2026-04-16 18:20:15.744177559 +0000 UTC m=+191.836465503" lastFinishedPulling="2026-04-16 18:20:17.811026984 +0000 UTC m=+193.903314931" observedRunningTime="2026-04-16 18:20:17.959812584 +0000 UTC m=+194.052100560" watchObservedRunningTime="2026-04-16 18:20:17.961022806 +0000 UTC m=+194.053310770" Apr 16 18:20:21.924964 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:21.924928 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm"] Apr 16 18:20:21.927750 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:21.927735 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" Apr 16 18:20:21.930303 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:21.930282 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 16 18:20:21.930419 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:21.930287 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-r8xtk\"" Apr 16 18:20:21.935716 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:21.935698 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm"] Apr 16 18:20:22.034278 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:22.034248 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/23e1b17b-ffd6-4cbd-acef-a6eae31b0283-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h5zzm\" (UID: \"23e1b17b-ffd6-4cbd-acef-a6eae31b0283\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" Apr 16 18:20:22.135524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:22.135489 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/23e1b17b-ffd6-4cbd-acef-a6eae31b0283-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h5zzm\" (UID: \"23e1b17b-ffd6-4cbd-acef-a6eae31b0283\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" Apr 16 18:20:22.137786 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:22.137754 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/23e1b17b-ffd6-4cbd-acef-a6eae31b0283-tls-certificates\") pod \"prometheus-operator-admission-webhook-9cb97cd87-h5zzm\" (UID: \"23e1b17b-ffd6-4cbd-acef-a6eae31b0283\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" Apr 16 18:20:22.236299 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:22.236219 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" Apr 16 18:20:22.347493 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:22.347466 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm"] Apr 16 18:20:22.350191 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:22.350166 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23e1b17b_ffd6_4cbd_acef_a6eae31b0283.slice/crio-9261a784470c3aa201c65081bd679793327f5647d8aed6c09a2c6f8ded5a8936 WatchSource:0}: Error finding container 9261a784470c3aa201c65081bd679793327f5647d8aed6c09a2c6f8ded5a8936: Status 404 returned error can't find the container with id 9261a784470c3aa201c65081bd679793327f5647d8aed6c09a2c6f8ded5a8936 Apr 16 18:20:22.947892 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:22.947852 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" event={"ID":"23e1b17b-ffd6-4cbd-acef-a6eae31b0283","Type":"ContainerStarted","Data":"9261a784470c3aa201c65081bd679793327f5647d8aed6c09a2c6f8ded5a8936"} Apr 16 18:20:23.951257 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:23.951168 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" event={"ID":"23e1b17b-ffd6-4cbd-acef-a6eae31b0283","Type":"ContainerStarted","Data":"cb0e726fb9e0d23fd2110579165d76833a42538251519bfceb98083662b0a3e6"} Apr 16 18:20:23.951676 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:23.951359 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" Apr 16 18:20:23.955823 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:23.955803 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" Apr 16 18:20:23.971253 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:23.971208 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-9cb97cd87-h5zzm" podStartSLOduration=1.64476514 podStartE2EDuration="2.971196146s" podCreationTimestamp="2026-04-16 18:20:21 +0000 UTC" firstStartedPulling="2026-04-16 18:20:22.352088858 +0000 UTC m=+198.444376800" lastFinishedPulling="2026-04-16 18:20:23.678519852 +0000 UTC m=+199.770807806" observedRunningTime="2026-04-16 18:20:23.97083459 +0000 UTC m=+200.063122556" watchObservedRunningTime="2026-04-16 18:20:23.971196146 +0000 UTC m=+200.063484118" Apr 16 18:20:24.994430 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:24.994397 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7dtds"] Apr 16 18:20:24.997439 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:24.997422 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.000316 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.000291 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 18:20:25.000440 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.000373 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 18:20:25.001391 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.001368 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 18:20:25.001485 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.001436 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-hcx9q\"" Apr 16 18:20:25.001485 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.001442 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 18:20:25.001757 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.001740 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 18:20:25.006407 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.006379 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7dtds"] Apr 16 18:20:25.055271 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.055235 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c759e2c5-22e6-4808-a546-dd9733343b92-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.055490 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.055296 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.055490 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.055381 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zql2x\" (UniqueName: \"kubernetes.io/projected/c759e2c5-22e6-4808-a546-dd9733343b92-kube-api-access-zql2x\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.055490 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.055433 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.156528 ip-10-0-136-226 
kubenswrapper[2566]: I0416 18:20:25.156492 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.156657 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.156554 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zql2x\" (UniqueName: \"kubernetes.io/projected/c759e2c5-22e6-4808-a546-dd9733343b92-kube-api-access-zql2x\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.156657 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.156604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.156657 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:20:25.156648 2566 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 18:20:25.156770 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.156669 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c759e2c5-22e6-4808-a546-dd9733343b92-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.156770 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:20:25.156717 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-tls podName:c759e2c5-22e6-4808-a546-dd9733343b92 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:25.656696484 +0000 UTC m=+201.748984449 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-tls") pod "prometheus-operator-78f957474d-7dtds" (UID: "c759e2c5-22e6-4808-a546-dd9733343b92") : secret "prometheus-operator-tls" not found Apr 16 18:20:25.157366 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.157342 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c759e2c5-22e6-4808-a546-dd9733343b92-metrics-client-ca\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.159134 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.159109 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.166527 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.166499 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zql2x\" (UniqueName: \"kubernetes.io/projected/c759e2c5-22e6-4808-a546-dd9733343b92-kube-api-access-zql2x\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.660895 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.660859 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.663158 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.663132 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/c759e2c5-22e6-4808-a546-dd9733343b92-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-7dtds\" (UID: \"c759e2c5-22e6-4808-a546-dd9733343b92\") " pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:25.906305 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:25.906258 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" Apr 16 18:20:26.020067 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:26.020036 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-7dtds"] Apr 16 18:20:26.022803 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:26.022779 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc759e2c5_22e6_4808_a546_dd9733343b92.slice/crio-12a4b83f518d21fde88927e77a60f3e0b0ef2dab530071d2396aad495bc138bb WatchSource:0}: Error finding container 12a4b83f518d21fde88927e77a60f3e0b0ef2dab530071d2396aad495bc138bb: Status 404 returned error can't find the container with id 12a4b83f518d21fde88927e77a60f3e0b0ef2dab530071d2396aad495bc138bb Apr 16 18:20:26.960947 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:26.960905 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" event={"ID":"c759e2c5-22e6-4808-a546-dd9733343b92","Type":"ContainerStarted","Data":"12a4b83f518d21fde88927e77a60f3e0b0ef2dab530071d2396aad495bc138bb"} Apr 16 18:20:27.964310 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:27.964228 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" event={"ID":"c759e2c5-22e6-4808-a546-dd9733343b92","Type":"ContainerStarted","Data":"761b80a9ae2acf19768bf9bb769540e4fac766589ab7c8a8b17c419f7adbbacd"} Apr 16 18:20:27.964310 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:27.964268 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" event={"ID":"c759e2c5-22e6-4808-a546-dd9733343b92","Type":"ContainerStarted","Data":"74b27be8b33b441978e3265034215385bc1ee7f74614b5d324ffb7d436073457"} Apr 16 18:20:27.987291 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:27.984556 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-7dtds" podStartSLOduration=2.378270893 podStartE2EDuration="3.984539554s" podCreationTimestamp="2026-04-16 18:20:24 +0000 UTC" firstStartedPulling="2026-04-16 18:20:26.02458959 +0000 UTC m=+202.116877536" lastFinishedPulling="2026-04-16 18:20:27.630858241 +0000 UTC m=+203.723146197" observedRunningTime="2026-04-16 18:20:27.98279262 +0000 UTC m=+204.075080585" watchObservedRunningTime="2026-04-16 18:20:27.984539554 +0000 UTC m=+204.076827520" Apr 16 18:20:30.360463 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.360431 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z79hs"] Apr 16 18:20:30.363573 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.363558 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.371823 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.371798 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:20:30.372096 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.372076 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 18:20:30.372969 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.372948 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-fwfs6\"" Apr 16 18:20:30.381248 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.381224 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z79hs"] Apr 16 18:20:30.384349 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.384324 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rt4w5"] Apr 16 18:20:30.387215 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.387201 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.389850 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.389825 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 18:20:30.389954 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.389889 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 18:20:30.390038 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.389956 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 18:20:30.390233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.390218 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-fmxn9\"" Apr 16 18:20:30.399290 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.399270 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rt4w5"] Apr 16 18:20:30.413477 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.413458 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-8jm7p"] Apr 16 18:20:30.416368 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.416355 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.418769 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.418754 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 18:20:30.418867 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.418792 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 18:20:30.419057 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.419040 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-k6cqr\"" Apr 16 18:20:30.419105 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.419075 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 18:20:30.499657 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499626 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.499807 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499668 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-textfile\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.499807 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499686 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-wtmp\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.499807 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499704 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04b00891-8f1a-4b47-bdb5-63b1933e788f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.499940 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499808 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.499940 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499845 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/04b00891-8f1a-4b47-bdb5-63b1933e788f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.499940 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499874 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04b00891-8f1a-4b47-bdb5-63b1933e788f-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.499940 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499915 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js54r\" (UniqueName: \"kubernetes.io/projected/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-api-access-js54r\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.500166 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.499954 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.500166 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500017 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.500166 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500061 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-root\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.500166 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500088 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-tls\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.500166 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500115 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.500166 ip-10-0-136-226 kubenswrapper[2566]: I0416 
18:20:30.500147 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.500347 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500198 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.500347 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500224 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bkwm\" (UniqueName: \"kubernetes.io/projected/285a85b9-1863-44eb-9c99-d65ffce469c1-kube-api-access-9bkwm\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.500347 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500255 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxbmq\" (UniqueName: \"kubernetes.io/projected/04b00891-8f1a-4b47-bdb5-63b1933e788f-kube-api-access-sxbmq\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.500347 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500273 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-sys\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.500347 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.500301 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/285a85b9-1863-44eb-9c99-d65ffce469c1-metrics-client-ca\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601105 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601070 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04b00891-8f1a-4b47-bdb5-63b1933e788f-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.601105 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601109 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-js54r\" (UniqueName: \"kubernetes.io/projected/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-api-access-js54r\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.601297 
ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601129 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601148 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601172 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-root\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601192 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-tls\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601215 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601235 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601265 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601271 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-root\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601297 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601292 2566 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bkwm\" (UniqueName: \"kubernetes.io/projected/285a85b9-1863-44eb-9c99-d65ffce469c1-kube-api-access-9bkwm\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601323 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxbmq\" (UniqueName: \"kubernetes.io/projected/04b00891-8f1a-4b47-bdb5-63b1933e788f-kube-api-access-sxbmq\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601349 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-sys\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601390 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/285a85b9-1863-44eb-9c99-d65ffce469c1-metrics-client-ca\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601419 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601454 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-textfile\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-wtmp\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601508 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04b00891-8f1a-4b47-bdb5-63b1933e788f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601616 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.601712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601654 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/04b00891-8f1a-4b47-bdb5-63b1933e788f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.602141 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601832 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-accelerators-collector-config\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.602141 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.601888 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/04b00891-8f1a-4b47-bdb5-63b1933e788f-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.602141 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.602080 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.602292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.602145 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.602343 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.602293 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-wtmp\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.602457 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.602434 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/285a85b9-1863-44eb-9c99-d65ffce469c1-sys\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.602533 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.602501 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/285a85b9-1863-44eb-9c99-d65ffce469c1-metrics-client-ca\") pod \"node-exporter-8jm7p\" (UID: 
\"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.602533 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.602521 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-textfile\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.602636 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:20:30.602588 2566 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 18:20:30.602689 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:20:30.602640 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-tls podName:4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579 nodeName:}" failed. No retries permitted until 2026-04-16 18:20:31.102623126 +0000 UTC m=+207.194911069 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-rt4w5" (UID: "4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579") : secret "kube-state-metrics-tls" not found Apr 16 18:20:30.603105 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.603083 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.604272 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.604248 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.604404 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.604388 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.604558 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.604542 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/04b00891-8f1a-4b47-bdb5-63b1933e788f-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.604741 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.604722 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/285a85b9-1863-44eb-9c99-d65ffce469c1-node-exporter-tls\") pod 
\"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.604790 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.604776 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/04b00891-8f1a-4b47-bdb5-63b1933e788f-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.612725 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.612660 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bkwm\" (UniqueName: \"kubernetes.io/projected/285a85b9-1863-44eb-9c99-d65ffce469c1-kube-api-access-9bkwm\") pod \"node-exporter-8jm7p\" (UID: \"285a85b9-1863-44eb-9c99-d65ffce469c1\") " pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.613006 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.612977 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxbmq\" (UniqueName: \"kubernetes.io/projected/04b00891-8f1a-4b47-bdb5-63b1933e788f-kube-api-access-sxbmq\") pod \"openshift-state-metrics-5669946b84-z79hs\" (UID: \"04b00891-8f1a-4b47-bdb5-63b1933e788f\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.613079 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.612987 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-js54r\" (UniqueName: \"kubernetes.io/projected/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-api-access-js54r\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:30.671740 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.671719 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" Apr 16 18:20:30.724463 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.724435 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-8jm7p" Apr 16 18:20:30.733536 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:30.733487 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod285a85b9_1863_44eb_9c99_d65ffce469c1.slice/crio-63c7a0dc59339539ebceabe4d34ad2b492132d3804e25827cef3e680233374a3 WatchSource:0}: Error finding container 63c7a0dc59339539ebceabe4d34ad2b492132d3804e25827cef3e680233374a3: Status 404 returned error can't find the container with id 63c7a0dc59339539ebceabe4d34ad2b492132d3804e25827cef3e680233374a3 Apr 16 18:20:30.786257 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.786200 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-z79hs"] Apr 16 18:20:30.788446 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:30.788425 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04b00891_8f1a_4b47_bdb5_63b1933e788f.slice/crio-30533a8162fa3fba6809babeec72b0808dd43887c5f1f01f307be89f2ea0ed95 WatchSource:0}: Error finding container 30533a8162fa3fba6809babeec72b0808dd43887c5f1f01f307be89f2ea0ed95: Status 404 returned error can't find the container with id 30533a8162fa3fba6809babeec72b0808dd43887c5f1f01f307be89f2ea0ed95 Apr 16 18:20:30.972444 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.972405 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" event={"ID":"04b00891-8f1a-4b47-bdb5-63b1933e788f","Type":"ContainerStarted","Data":"9db7a72c78c9971849c65310d1299c1a59082bbb50c8c74a630542070bfd87d9"} Apr 16 18:20:30.972444 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.972448 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" event={"ID":"04b00891-8f1a-4b47-bdb5-63b1933e788f","Type":"ContainerStarted","Data":"248f6d9fda3c48f1a70f1f07502439f1a8b578cb2c92490fbfd3dfc632616017"} Apr 16 18:20:30.972657 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.972464 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" event={"ID":"04b00891-8f1a-4b47-bdb5-63b1933e788f","Type":"ContainerStarted","Data":"30533a8162fa3fba6809babeec72b0808dd43887c5f1f01f307be89f2ea0ed95"} Apr 16 18:20:30.973494 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:30.973471 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jm7p" event={"ID":"285a85b9-1863-44eb-9c99-d65ffce469c1","Type":"ContainerStarted","Data":"63c7a0dc59339539ebceabe4d34ad2b492132d3804e25827cef3e680233374a3"} Apr 16 18:20:31.105775 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:31.105740 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:31.108009 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:31.107966 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579-kube-state-metrics-tls\") pod 
\"kube-state-metrics-7479c89684-rt4w5\" (UID: \"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:31.295235 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:31.295146 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" Apr 16 18:20:31.430257 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:31.430099 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-rt4w5"] Apr 16 18:20:31.519595 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:31.519563 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d33d8d2_1b22_4a2b_bdef_ebdfa8fdc579.slice/crio-659216e03c4bb423dadeb67e0850774ab790d976b9b2cba3e2ea86bdbd7e9f96 WatchSource:0}: Error finding container 659216e03c4bb423dadeb67e0850774ab790d976b9b2cba3e2ea86bdbd7e9f96: Status 404 returned error can't find the container with id 659216e03c4bb423dadeb67e0850774ab790d976b9b2cba3e2ea86bdbd7e9f96 Apr 16 18:20:31.978645 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:31.978598 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" event={"ID":"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579","Type":"ContainerStarted","Data":"659216e03c4bb423dadeb67e0850774ab790d976b9b2cba3e2ea86bdbd7e9f96"} Apr 16 18:20:31.979956 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:31.979932 2566 generic.go:358] "Generic (PLEG): container finished" podID="285a85b9-1863-44eb-9c99-d65ffce469c1" containerID="4d7f4ed2866b8accaf4a69fc23e2fad1f9c9535ac10d241ad352d924c3137548" exitCode=0 Apr 16 18:20:31.980093 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:31.979970 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jm7p" event={"ID":"285a85b9-1863-44eb-9c99-d65ffce469c1","Type":"ContainerDied","Data":"4d7f4ed2866b8accaf4a69fc23e2fad1f9c9535ac10d241ad352d924c3137548"} Apr 16 18:20:32.392586 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.392551 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-67d4b96bc7-zhxww"] Apr 16 18:20:32.396193 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.396171 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.399077 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.398844 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 16 18:20:32.399077 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.398955 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6khgpsac070p0\"" Apr 16 18:20:32.399077 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.398955 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 16 18:20:32.399077 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.399054 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-bqsrc\"" Apr 16 18:20:32.399337 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.399097 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 16 18:20:32.399337 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.399179 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 16 18:20:32.399337 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.399192 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 16 18:20:32.410606 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.410587 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67d4b96bc7-zhxww"] Apr 16 18:20:32.518812 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.518781 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.518812 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.518812 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-678xk\" (UniqueName: \"kubernetes.io/projected/20d27a74-6f2b-4530-8101-ef28c10a70a7-kube-api-access-678xk\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.519218 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.518872 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.519218 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.518966 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-tls\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.519218 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.519022 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20d27a74-6f2b-4530-8101-ef28c10a70a7-metrics-client-ca\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.519218 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.519047 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-grpc-tls\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.519218 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.519084 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.519218 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.519124 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.619801 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.619766 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.619935 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.619817 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.619935 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.619902 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " 
pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.619935 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.619928 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-678xk\" (UniqueName: \"kubernetes.io/projected/20d27a74-6f2b-4530-8101-ef28c10a70a7-kube-api-access-678xk\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.620117 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.619969 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.620117 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.620049 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-tls\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.620117 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.620084 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20d27a74-6f2b-4530-8101-ef28c10a70a7-metrics-client-ca\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.620117 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.620111 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-grpc-tls\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.621021 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.620908 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20d27a74-6f2b-4530-8101-ef28c10a70a7-metrics-client-ca\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.622837 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.622808 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.623140 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.623088 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: 
\"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.623244 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.623194 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.623244 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.623204 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-tls\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.623443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.623421 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-grpc-tls\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.623618 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.623589 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/20d27a74-6f2b-4530-8101-ef28c10a70a7-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.641948 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.641862 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-678xk\" (UniqueName: \"kubernetes.io/projected/20d27a74-6f2b-4530-8101-ef28c10a70a7-kube-api-access-678xk\") pod \"thanos-querier-67d4b96bc7-zhxww\" (UID: \"20d27a74-6f2b-4530-8101-ef28c10a70a7\") " pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.708202 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.708099 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:32.986355 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.986306 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jm7p" event={"ID":"285a85b9-1863-44eb-9c99-d65ffce469c1","Type":"ContainerStarted","Data":"06c31a6a4ed9ca4fbca079e08e972517c2a8463895c83a8c137a185c8d5e4bfc"} Apr 16 18:20:32.986355 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.986346 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-8jm7p" event={"ID":"285a85b9-1863-44eb-9c99-d65ffce469c1","Type":"ContainerStarted","Data":"526dc04d57b5b7975993c2c0c39b77a36a2e1d01841a147ca3e2096b381f3462"} Apr 16 18:20:32.988474 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:32.988447 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" event={"ID":"04b00891-8f1a-4b47-bdb5-63b1933e788f","Type":"ContainerStarted","Data":"1b9dbc66a13b5dde936522cb322fa7c4b6395a3f78ea56d125d6e5a5bc100f4b"} Apr 16 18:20:33.008201 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:33.008159 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-8jm7p" podStartSLOduration=2.175524727 podStartE2EDuration="3.008145212s" podCreationTimestamp="2026-04-16 18:20:30 +0000 UTC" firstStartedPulling="2026-04-16 18:20:30.735224261 +0000 UTC m=+206.827512214" lastFinishedPulling="2026-04-16 18:20:31.567844736 +0000 UTC m=+207.660132699" observedRunningTime="2026-04-16 18:20:33.007019203 +0000 UTC m=+209.099307169" watchObservedRunningTime="2026-04-16 18:20:33.008145212 +0000 UTC m=+209.100433168" Apr 16 18:20:33.014920 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:33.014812 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-67d4b96bc7-zhxww"] Apr 16 18:20:33.017894 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:33.017873 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20d27a74_6f2b_4530_8101_ef28c10a70a7.slice/crio-27de53fcc48b05cfdf44f284fa75f8eda98dd01aeae53576e4798968ebf37ecf WatchSource:0}: Error finding container 27de53fcc48b05cfdf44f284fa75f8eda98dd01aeae53576e4798968ebf37ecf: Status 404 returned error can't find the container with id 27de53fcc48b05cfdf44f284fa75f8eda98dd01aeae53576e4798968ebf37ecf Apr 16 18:20:33.033035 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:33.032648 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-z79hs" podStartSLOduration=1.950710243 podStartE2EDuration="3.032630427s" podCreationTimestamp="2026-04-16 18:20:30 +0000 UTC" firstStartedPulling="2026-04-16 18:20:30.899904246 +0000 UTC m=+206.992192192" lastFinishedPulling="2026-04-16 18:20:31.981824427 +0000 UTC m=+208.074112376" observedRunningTime="2026-04-16 18:20:33.032084212 +0000 UTC m=+209.124372190" watchObservedRunningTime="2026-04-16 18:20:33.032630427 +0000 UTC m=+209.124918393" Apr 16 18:20:33.993437 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:33.993396 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" event={"ID":"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579","Type":"ContainerStarted","Data":"69b028d5aabdd151272fb7ed65a50c6f3ce9e216cc86bbe57bd39b4dfc3e1513"} Apr 16 18:20:33.993437 
ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:33.993439 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" event={"ID":"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579","Type":"ContainerStarted","Data":"cd6021fd3d22c6eca01e8b9bb45710e3afe3fdc974340f737dffb5fbcd68f6e9"} Apr 16 18:20:33.993936 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:33.993454 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" event={"ID":"4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579","Type":"ContainerStarted","Data":"3a6e19328b3b6691446511788f32365d93b17cf74edc75315c5d722d2a455470"} Apr 16 18:20:33.994557 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:33.994528 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" event={"ID":"20d27a74-6f2b-4530-8101-ef28c10a70a7","Type":"ContainerStarted","Data":"27de53fcc48b05cfdf44f284fa75f8eda98dd01aeae53576e4798968ebf37ecf"} Apr 16 18:20:34.474953 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:34.474903 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-rt4w5" podStartSLOduration=3.055579459 podStartE2EDuration="4.47488907s" podCreationTimestamp="2026-04-16 18:20:30 +0000 UTC" firstStartedPulling="2026-04-16 18:20:31.523588929 +0000 UTC m=+207.615876873" lastFinishedPulling="2026-04-16 18:20:32.942898538 +0000 UTC m=+209.035186484" observedRunningTime="2026-04-16 18:20:34.018363256 +0000 UTC m=+210.110651222" watchObservedRunningTime="2026-04-16 18:20:34.47488907 +0000 UTC m=+210.567177159" Apr 16 18:20:34.998362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:34.998338 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" event={"ID":"20d27a74-6f2b-4530-8101-ef28c10a70a7","Type":"ContainerStarted","Data":"0e67581ead6265e3515e5fa8cf79a514fb4f2be2abfcdb11f959e166011494a9"} Apr 16 18:20:35.611179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.611151 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6f476949-898w5"] Apr 16 18:20:35.614342 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.614326 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.616930 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.616903 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 18:20:35.617076 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.616953 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 18:20:35.617076 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.616987 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 18:20:35.617253 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.617239 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-x6hfq\"" Apr 16 18:20:35.617321 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.617310 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 18:20:35.617433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.617419 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 18:20:35.624193 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.624175 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 18:20:35.627857 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.627835 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f476949-898w5"] Apr 16 18:20:35.750035 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750008 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-federate-client-tls\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.750169 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750066 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.750283 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750211 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-telemeter-client-tls\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.750283 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750249 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.750409 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750389 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-metrics-client-ca\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.750477 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750432 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-serving-certs-ca-bundle\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.750477 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750465 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6pjr\" (UniqueName: \"kubernetes.io/projected/d4f8ae75-cc0c-49bd-9485-0229eb626e51-kube-api-access-c6pjr\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.750549 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.750520 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-secret-telemeter-client\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851377 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-federate-client-tls\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851465 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851548 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-telemeter-client-tls\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851575 2566 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851682 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-metrics-client-ca\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851722 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-serving-certs-ca-bundle\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851750 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c6pjr\" (UniqueName: \"kubernetes.io/projected/d4f8ae75-cc0c-49bd-9485-0229eb626e51-kube-api-access-c6pjr\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.852393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.851796 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-secret-telemeter-client\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.857538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.856520 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-secret-telemeter-client\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.857538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.856900 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-serving-certs-ca-bundle\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.857538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.857360 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-metrics-client-ca\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.857538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.857478 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4f8ae75-cc0c-49bd-9485-0229eb626e51-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.857984 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.857962 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-federate-client-tls\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.858888 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.858846 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.859633 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.859615 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d4f8ae75-cc0c-49bd-9485-0229eb626e51-telemeter-client-tls\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.866291 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.866234 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6pjr\" (UniqueName: \"kubernetes.io/projected/d4f8ae75-cc0c-49bd-9485-0229eb626e51-kube-api-access-c6pjr\") pod \"telemeter-client-6f476949-898w5\" (UID: \"d4f8ae75-cc0c-49bd-9485-0229eb626e51\") " pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:35.923594 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:35.923560 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6f476949-898w5" Apr 16 18:20:36.003210 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.003176 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" event={"ID":"20d27a74-6f2b-4530-8101-ef28c10a70a7","Type":"ContainerStarted","Data":"7bed5fb9a7d8e1813939b111ead2ff5f88e83c57884a5b74d6402d43ac7fee8e"} Apr 16 18:20:36.003210 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.003213 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" event={"ID":"20d27a74-6f2b-4530-8101-ef28c10a70a7","Type":"ContainerStarted","Data":"33f19ea6ae691906d87f94b3313751b3fa1cdbd156695169756190f0cba5258f"} Apr 16 18:20:36.116786 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.116706 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6f476949-898w5"] Apr 16 18:20:36.120845 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:20:36.120818 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4f8ae75_cc0c_49bd_9485_0229eb626e51.slice/crio-72e5f30e5e2b0f5a52317e6b11d584c7b369d4ba495193f828677cdffbfff47f WatchSource:0}: Error finding container 72e5f30e5e2b0f5a52317e6b11d584c7b369d4ba495193f828677cdffbfff47f: Status 404 returned error can't find the container with id 72e5f30e5e2b0f5a52317e6b11d584c7b369d4ba495193f828677cdffbfff47f Apr 16 18:20:36.650927 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.650891 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:36.655114 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.655085 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.659459 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.659432 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:20:36.659570 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.659502 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:20:36.659631 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.659577 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.661396 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.661644 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.661824 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.662134 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.662336 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.662526 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.662731 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:20:36.663233 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.662948 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2mefd2br9knqj\"" Apr 16 18:20:36.663649 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.663279 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:20:36.663649 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.663339 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-985hc\"" Apr 16 18:20:36.663649 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.663413 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:20:36.668935 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.668918 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:20:36.677308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.677284 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:36.761730 
ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.761630 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.761730 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.761686 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.761955 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.761792 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.761955 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.761823 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stn9v\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-kube-api-access-stn9v\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.761955 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.761915 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.761955 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.761947 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762202 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762042 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762202 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762082 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762202 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762115 2566 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762202 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762135 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762202 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762169 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762441 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762213 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762441 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762260 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762441 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762331 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762441 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762368 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762441 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762395 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762441 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762423 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.762441 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.762440 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.863588 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.863480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.863588 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.863530 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.863588 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.863551 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.863588 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.863578 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864018 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.863977 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864106 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864046 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864154 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864109 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:20:36.864200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864154 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864200 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864183 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864322 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864208 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864322 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864247 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864322 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864274 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864322 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864317 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864354 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864415 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864440 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-stn9v\" (UniqueName: 
\"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-kube-api-access-stn9v\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864477 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864504 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.864948 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.864853 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.868050 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.867694 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.868338 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.868305 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.869533 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.868553 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.869533 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.868876 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.869533 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.869016 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.869533 ip-10-0-136-226 kubenswrapper[2566]: 
I0416 18:20:36.869211 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.869789 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.869560 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.869789 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.869692 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.870455 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.870426 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.870598 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.870549 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.871096 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.871066 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.871510 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.871467 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-web-config\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.871812 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.871786 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.871812 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.871810 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config-out\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.872670 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.872650 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.873901 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.873874 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.878074 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.878054 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-stn9v\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-kube-api-access-stn9v\") pod \"prometheus-k8s-0\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:36.971163 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:36.970909 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:20:37.008478 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.008432 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f476949-898w5" event={"ID":"d4f8ae75-cc0c-49bd-9485-0229eb626e51","Type":"ContainerStarted","Data":"72e5f30e5e2b0f5a52317e6b11d584c7b369d4ba495193f828677cdffbfff47f"} Apr 16 18:20:37.012052 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.011977 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" event={"ID":"20d27a74-6f2b-4530-8101-ef28c10a70a7","Type":"ContainerStarted","Data":"53be86d9b6317b72a95f8055711206316711c7c63ddf08e60f2fcc05d1392284"} Apr 16 18:20:37.012052 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.012040 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" event={"ID":"20d27a74-6f2b-4530-8101-ef28c10a70a7","Type":"ContainerStarted","Data":"fa0fc975cc7dfbd204740e0de07a7258c40353b00b307fa9e313c33e2e4f55c7"} Apr 16 18:20:37.012052 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.012055 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" event={"ID":"20d27a74-6f2b-4530-8101-ef28c10a70a7","Type":"ContainerStarted","Data":"ecd21f925a08360a055cb89cf62c08fc13961c427c9a28136109c90826e6162b"} Apr 16 18:20:37.012342 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.012175 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:37.037857 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.037636 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" podStartSLOduration=2.036351732 podStartE2EDuration="5.037616639s" podCreationTimestamp="2026-04-16 18:20:32 +0000 UTC" firstStartedPulling="2026-04-16 18:20:33.020918738 +0000 UTC m=+209.113206684" lastFinishedPulling="2026-04-16 18:20:36.022183645 +0000 UTC m=+212.114471591" 
observedRunningTime="2026-04-16 18:20:37.037429122 +0000 UTC m=+213.129717114" watchObservedRunningTime="2026-04-16 18:20:37.037616639 +0000 UTC m=+213.129904605" Apr 16 18:20:37.128270 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.128242 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:20:37.163212 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:37.163176 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9bcb99cb4-2lzkn"] Apr 16 18:20:37.163462 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:20:37.163442 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" podUID="19763774-6747-497c-913a-f8852a4e5a0d" Apr 16 18:20:38.016856 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.016813 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerStarted","Data":"d3348fda51b7eba8f05f6886dcbf32f425f856b178fb3e971c4a154b79b4af20"} Apr 16 18:20:38.018240 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.018205 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f476949-898w5" event={"ID":"d4f8ae75-cc0c-49bd-9485-0229eb626e51","Type":"ContainerStarted","Data":"e7f65c04c1c71301028782611e4e332187ea317ec0237aaab1e44c67bf1754e1"} Apr 16 18:20:38.018347 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.018265 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:20:38.023208 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.023153 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:20:38.178141 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178112 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-image-registry-private-configuration\") pod \"19763774-6747-497c-913a-f8852a4e5a0d\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " Apr 16 18:20:38.178310 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178166 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c8vlp\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-kube-api-access-c8vlp\") pod \"19763774-6747-497c-913a-f8852a4e5a0d\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " Apr 16 18:20:38.178310 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178216 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19763774-6747-497c-913a-f8852a4e5a0d-ca-trust-extracted\") pod \"19763774-6747-497c-913a-f8852a4e5a0d\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " Apr 16 18:20:38.178310 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178291 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-bound-sa-token\") pod \"19763774-6747-497c-913a-f8852a4e5a0d\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " Apr 16 18:20:38.178473 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178315 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-trusted-ca\") pod \"19763774-6747-497c-913a-f8852a4e5a0d\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " Apr 16 18:20:38.178473 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178342 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-registry-certificates\") pod \"19763774-6747-497c-913a-f8852a4e5a0d\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " Apr 16 18:20:38.178473 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178367 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-installation-pull-secrets\") pod \"19763774-6747-497c-913a-f8852a4e5a0d\" (UID: \"19763774-6747-497c-913a-f8852a4e5a0d\") " Apr 16 18:20:38.178688 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178662 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19763774-6747-497c-913a-f8852a4e5a0d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "19763774-6747-497c-913a-f8852a4e5a0d" (UID: "19763774-6747-497c-913a-f8852a4e5a0d"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:20:38.179074 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178797 2566 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/19763774-6747-497c-913a-f8852a4e5a0d-ca-trust-extracted\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:38.179074 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178818 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "19763774-6747-497c-913a-f8852a4e5a0d" (UID: "19763774-6747-497c-913a-f8852a4e5a0d"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:38.179074 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.178982 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "19763774-6747-497c-913a-f8852a4e5a0d" (UID: "19763774-6747-497c-913a-f8852a4e5a0d"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:20:38.181211 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.181169 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "19763774-6747-497c-913a-f8852a4e5a0d" (UID: "19763774-6747-497c-913a-f8852a4e5a0d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:38.181541 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.181502 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "19763774-6747-497c-913a-f8852a4e5a0d" (UID: "19763774-6747-497c-913a-f8852a4e5a0d"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:20:38.181962 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.181937 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "19763774-6747-497c-913a-f8852a4e5a0d" (UID: "19763774-6747-497c-913a-f8852a4e5a0d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:38.182296 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.182270 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-kube-api-access-c8vlp" (OuterVolumeSpecName: "kube-api-access-c8vlp") pod "19763774-6747-497c-913a-f8852a4e5a0d" (UID: "19763774-6747-497c-913a-f8852a4e5a0d"). InnerVolumeSpecName "kube-api-access-c8vlp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:20:38.280308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.280222 2566 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-bound-sa-token\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:38.280308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.280258 2566 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-trusted-ca\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:38.280308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.280274 2566 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/19763774-6747-497c-913a-f8852a4e5a0d-registry-certificates\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:38.280308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.280288 2566 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-installation-pull-secrets\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:38.280308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.280304 2566 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/19763774-6747-497c-913a-f8852a4e5a0d-image-registry-private-configuration\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:38.280611 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:38.280320 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c8vlp\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-kube-api-access-c8vlp\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:39.023144 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.023103 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f476949-898w5" event={"ID":"d4f8ae75-cc0c-49bd-9485-0229eb626e51","Type":"ContainerStarted","Data":"f6c3ebcc71e60758f0403bd8d639a14b4255382840b42f2720e99a7e71513943"} Apr 16 18:20:39.023144 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.023145 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6f476949-898w5" event={"ID":"d4f8ae75-cc0c-49bd-9485-0229eb626e51","Type":"ContainerStarted","Data":"739b9cb3ec7ec7953f17100dd06d1fbd7ea6879e941fe063de4b59f83f4acbf2"} Apr 16 18:20:39.024419 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.024392 2566 generic.go:358] "Generic (PLEG): container finished" podID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerID="05bfca08760bed2d7a4caa3682e18b8d73f5cb514b1ecf2cc2f1d231265a3565" exitCode=0 Apr 16 18:20:39.024528 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.024458 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-9bcb99cb4-2lzkn" Apr 16 18:20:39.024528 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.024469 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"05bfca08760bed2d7a4caa3682e18b8d73f5cb514b1ecf2cc2f1d231265a3565"} Apr 16 18:20:39.050517 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.050465 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6f476949-898w5" podStartSLOduration=1.6835713829999999 podStartE2EDuration="4.050450211s" podCreationTimestamp="2026-04-16 18:20:35 +0000 UTC" firstStartedPulling="2026-04-16 18:20:36.124550208 +0000 UTC m=+212.216838154" lastFinishedPulling="2026-04-16 18:20:38.491429039 +0000 UTC m=+214.583716982" observedRunningTime="2026-04-16 18:20:39.050027666 +0000 UTC m=+215.142315629" watchObservedRunningTime="2026-04-16 18:20:39.050450211 +0000 UTC m=+215.142738167" Apr 16 18:20:39.096323 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.096298 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-9bcb99cb4-2lzkn"] Apr 16 18:20:39.100832 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.100808 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-9bcb99cb4-2lzkn"] Apr 16 18:20:39.192820 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:39.192787 2566 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/19763774-6747-497c-913a-f8852a4e5a0d-registry-tls\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:20:40.452122 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:40.452088 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="19763774-6747-497c-913a-f8852a4e5a0d" path="/var/lib/kubelet/pods/19763774-6747-497c-913a-f8852a4e5a0d/volumes" Apr 16 18:20:43.024374 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.024347 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-67d4b96bc7-zhxww" Apr 16 18:20:43.039574 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.039551 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerStarted","Data":"5522cb4181894dbbed678c48874a7b6607bc9ecd8b391203a4e336dc985cbd9a"} Apr 16 18:20:43.039703 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.039578 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerStarted","Data":"cc9fd38750bf43cf3a56255115d190a9576bc51c118f4fbdf029576dfc44ef0d"} Apr 16 18:20:43.039703 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.039589 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerStarted","Data":"37d1ae52e0af602103a83e031f399bcf36b0d95edcf20570d3561e07c7ea1209"} Apr 16 18:20:43.039703 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.039598 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerStarted","Data":"34878a66061d453a64fa2bfdcbc56128c06bc5d91564df75d24b080746932681"} Apr 16 18:20:43.039703 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.039605 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerStarted","Data":"91aa75d83a8b8b588dd369a1aa2a8399987c635d8740b0d9d1a4323327513f89"} Apr 16 18:20:43.039703 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.039615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerStarted","Data":"9858b22b3f558f508927b04e2e1f738ff1a4cd9f660877d69f0862691437df4f"} Apr 16 18:20:43.080733 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:43.080683 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=1.982988582 podStartE2EDuration="7.080667394s" podCreationTimestamp="2026-04-16 18:20:36 +0000 UTC" firstStartedPulling="2026-04-16 18:20:37.133856857 +0000 UTC m=+213.226144800" lastFinishedPulling="2026-04-16 18:20:42.231535666 +0000 UTC m=+218.323823612" observedRunningTime="2026-04-16 18:20:43.078824253 +0000 UTC m=+219.171112218" watchObservedRunningTime="2026-04-16 18:20:43.080667394 +0000 UTC m=+219.172955360" Apr 16 18:20:46.971722 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:20:46.971682 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:16.216965 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:16.216928 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:21:16.219633 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:16.219605 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bd001d43-c6f4-44f4-906e-c01f02068004-metrics-certs\") pod \"network-metrics-daemon-jj9db\" (UID: \"bd001d43-c6f4-44f4-906e-c01f02068004\") " pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:21:16.250968 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:16.250944 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-8x9mw\"" Apr 16 18:21:16.259041 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:16.259019 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jj9db" Apr 16 18:21:16.380222 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:16.380196 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jj9db"] Apr 16 18:21:16.383252 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:21:16.383224 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd001d43_c6f4_44f4_906e_c01f02068004.slice/crio-faaf794f18becf787d3bd2aaebc33bef44c02c29ab4f4e7e62d18f72c9508dc7 WatchSource:0}: Error finding container faaf794f18becf787d3bd2aaebc33bef44c02c29ab4f4e7e62d18f72c9508dc7: Status 404 returned error can't find the container with id faaf794f18becf787d3bd2aaebc33bef44c02c29ab4f4e7e62d18f72c9508dc7 Apr 16 18:21:17.010041 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:17.010009 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5psgq_125ba5ab-da90-4b8d-b93b-56e647e63aff/dns-node-resolver/0.log" Apr 16 18:21:17.132041 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:17.131985 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jj9db" event={"ID":"bd001d43-c6f4-44f4-906e-c01f02068004","Type":"ContainerStarted","Data":"faaf794f18becf787d3bd2aaebc33bef44c02c29ab4f4e7e62d18f72c9508dc7"} Apr 16 18:21:18.135921 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:18.135886 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jj9db" event={"ID":"bd001d43-c6f4-44f4-906e-c01f02068004","Type":"ContainerStarted","Data":"4d6364c20cfebee80265494341a9396529285a6f8137b0aa6bd69299b6c6176d"} Apr 16 18:21:18.135921 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:18.135920 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jj9db" event={"ID":"bd001d43-c6f4-44f4-906e-c01f02068004","Type":"ContainerStarted","Data":"e4d9ef69e69f6a6a99cecf13c32b26f90bde1cd497ec7e4232519a1207cc16bd"} Apr 16 18:21:18.156301 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:18.156257 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jj9db" podStartSLOduration=253.220135448 podStartE2EDuration="4m14.156243019s" podCreationTimestamp="2026-04-16 18:17:04 +0000 UTC" firstStartedPulling="2026-04-16 18:21:16.385511295 +0000 UTC m=+252.477799237" lastFinishedPulling="2026-04-16 18:21:17.321618865 +0000 UTC m=+253.413906808" observedRunningTime="2026-04-16 18:21:18.154977926 +0000 UTC m=+254.247265892" watchObservedRunningTime="2026-04-16 18:21:18.156243019 +0000 UTC m=+254.248530984" Apr 16 18:21:36.971561 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:36.971461 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:36.989906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:36.989872 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:37.208641 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:37.208614 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:42.868417 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:21:42.868373 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached 
volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" podUID="c7de8316-4440-4039-ae31-310a6c1146a9" Apr 16 18:21:42.868417 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:21:42.868374 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2zc2n" podUID="4a945a62-4bc4-4f09-8555-50569018d9ac" Apr 16 18:21:43.210869 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:43.210794 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:21:43.211026 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:43.210794 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2zc2n" Apr 16 18:21:46.372303 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.372262 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:21:46.372781 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.372324 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:21:46.372781 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.372350 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:21:46.374693 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.374664 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a945a62-4bc4-4f09-8555-50569018d9ac-metrics-tls\") pod \"dns-default-2zc2n\" (UID: \"4a945a62-4bc4-4f09-8555-50569018d9ac\") " pod="openshift-dns/dns-default-2zc2n" Apr 16 18:21:46.374805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.374764 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c7de8316-4440-4039-ae31-310a6c1146a9-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-r2ztm\" (UID: \"c7de8316-4440-4039-ae31-310a6c1146a9\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:21:46.374926 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.374906 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6b838b63-87c9-46b4-96a1-ed246b230c36-cert\") pod \"ingress-canary-cpbgb\" (UID: \"6b838b63-87c9-46b4-96a1-ed246b230c36\") " pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:21:46.451508 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.451481 2566 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-v2cgj\"" Apr 16 18:21:46.459642 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.459624 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cpbgb" Apr 16 18:21:46.513602 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.513577 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-m4v2s\"" Apr 16 18:21:46.513750 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.513725 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-nk7cg\"" Apr 16 18:21:46.522377 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.522357 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2zc2n" Apr 16 18:21:46.522527 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.522505 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" Apr 16 18:21:46.586628 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.586559 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cpbgb"] Apr 16 18:21:46.591351 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:21:46.591320 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b838b63_87c9_46b4_96a1_ed246b230c36.slice/crio-f44fd362ba9c17ed3c1c4bb954529fdb44dd16537338c333384ffc98656b9bd4 WatchSource:0}: Error finding container f44fd362ba9c17ed3c1c4bb954529fdb44dd16537338c333384ffc98656b9bd4: Status 404 returned error can't find the container with id f44fd362ba9c17ed3c1c4bb954529fdb44dd16537338c333384ffc98656b9bd4 Apr 16 18:21:46.657076 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.657055 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2zc2n"] Apr 16 18:21:46.658882 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:21:46.658856 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a945a62_4bc4_4f09_8555_50569018d9ac.slice/crio-2357451c243a43a35474d0ddf66382f2c094893b7b8456da89163fcdac7a1322 WatchSource:0}: Error finding container 2357451c243a43a35474d0ddf66382f2c094893b7b8456da89163fcdac7a1322: Status 404 returned error can't find the container with id 2357451c243a43a35474d0ddf66382f2c094893b7b8456da89163fcdac7a1322 Apr 16 18:21:46.672835 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:46.672813 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm"] Apr 16 18:21:46.674543 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:21:46.674520 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7de8316_4440_4039_ae31_310a6c1146a9.slice/crio-58d5b6cc9ef2e4b2e2546f1d0503b64d7f038b4cf2d5de799e7fbfad94601de6 WatchSource:0}: Error finding container 58d5b6cc9ef2e4b2e2546f1d0503b64d7f038b4cf2d5de799e7fbfad94601de6: Status 404 returned error can't find the container with id 58d5b6cc9ef2e4b2e2546f1d0503b64d7f038b4cf2d5de799e7fbfad94601de6 Apr 16 18:21:47.224087 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:47.224025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" event={"ID":"c7de8316-4440-4039-ae31-310a6c1146a9","Type":"ContainerStarted","Data":"58d5b6cc9ef2e4b2e2546f1d0503b64d7f038b4cf2d5de799e7fbfad94601de6"} Apr 16 18:21:47.225786 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:47.225737 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2zc2n" event={"ID":"4a945a62-4bc4-4f09-8555-50569018d9ac","Type":"ContainerStarted","Data":"2357451c243a43a35474d0ddf66382f2c094893b7b8456da89163fcdac7a1322"} Apr 16 18:21:47.227189 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:47.227169 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cpbgb" event={"ID":"6b838b63-87c9-46b4-96a1-ed246b230c36","Type":"ContainerStarted","Data":"f44fd362ba9c17ed3c1c4bb954529fdb44dd16537338c333384ffc98656b9bd4"} Apr 16 18:21:49.234182 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.234089 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" event={"ID":"c7de8316-4440-4039-ae31-310a6c1146a9","Type":"ContainerStarted","Data":"0de520fe4e128e5e496fdf10448bb5a6135c97cbdfe0801935bc8a7c69948eac"} Apr 16 18:21:49.235639 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.235613 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2zc2n" event={"ID":"4a945a62-4bc4-4f09-8555-50569018d9ac","Type":"ContainerStarted","Data":"6d181c3814beb1c25d6383358d9f73d0e5b926bb1f3b677c3379022a9bd39ebd"} Apr 16 18:21:49.235639 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.235639 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2zc2n" event={"ID":"4a945a62-4bc4-4f09-8555-50569018d9ac","Type":"ContainerStarted","Data":"a4670152821fae77b43cf5f5d127f27b9b3afc69014876b25ef71bfae3d87a7a"} Apr 16 18:21:49.235810 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.235735 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2zc2n" Apr 16 18:21:49.236838 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.236818 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cpbgb" event={"ID":"6b838b63-87c9-46b4-96a1-ed246b230c36","Type":"ContainerStarted","Data":"0743e11ad8862a47b9a292ded2e87f1fd3d0dc5432b8964ec9e9d07cfb7d1f0c"} Apr 16 18:21:49.252395 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.252352 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-r2ztm" podStartSLOduration=259.327118861 podStartE2EDuration="4m21.252340548s" podCreationTimestamp="2026-04-16 18:17:28 +0000 UTC" firstStartedPulling="2026-04-16 18:21:46.67790342 +0000 UTC m=+282.770191364" lastFinishedPulling="2026-04-16 18:21:48.603125094 +0000 UTC m=+284.695413051" observedRunningTime="2026-04-16 18:21:49.251514321 +0000 UTC m=+285.343802286" watchObservedRunningTime="2026-04-16 18:21:49.252340548 +0000 UTC m=+285.344628513" Apr 16 18:21:49.270238 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.270192 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2zc2n" podStartSLOduration=251.323859341 podStartE2EDuration="4m13.270176393s" podCreationTimestamp="2026-04-16 18:17:36 +0000 UTC" firstStartedPulling="2026-04-16 18:21:46.660701101 +0000 UTC m=+282.752989047" 
lastFinishedPulling="2026-04-16 18:21:48.607018141 +0000 UTC m=+284.699306099" observedRunningTime="2026-04-16 18:21:49.269492805 +0000 UTC m=+285.361780770" watchObservedRunningTime="2026-04-16 18:21:49.270176393 +0000 UTC m=+285.362464359" Apr 16 18:21:49.285663 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:49.285621 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cpbgb" podStartSLOduration=251.272282473 podStartE2EDuration="4m13.285609137s" podCreationTimestamp="2026-04-16 18:17:36 +0000 UTC" firstStartedPulling="2026-04-16 18:21:46.594405328 +0000 UTC m=+282.686693273" lastFinishedPulling="2026-04-16 18:21:48.60773198 +0000 UTC m=+284.700019937" observedRunningTime="2026-04-16 18:21:49.284306542 +0000 UTC m=+285.376594508" watchObservedRunningTime="2026-04-16 18:21:49.285609137 +0000 UTC m=+285.377897132" Apr 16 18:21:55.077351 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.077314 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:21:55.077900 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.077774 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="prometheus" containerID="cri-o://9858b22b3f558f508927b04e2e1f738ff1a4cd9f660877d69f0862691437df4f" gracePeriod=600 Apr 16 18:21:55.077900 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.077809 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="config-reloader" containerID="cri-o://91aa75d83a8b8b588dd369a1aa2a8399987c635d8740b0d9d1a4323327513f89" gracePeriod=600 Apr 16 18:21:55.077900 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.077868 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-thanos" containerID="cri-o://5522cb4181894dbbed678c48874a7b6607bc9ecd8b391203a4e336dc985cbd9a" gracePeriod=600 Apr 16 18:21:55.077900 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.077815 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy" containerID="cri-o://cc9fd38750bf43cf3a56255115d190a9576bc51c118f4fbdf029576dfc44ef0d" gracePeriod=600 Apr 16 18:21:55.078215 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.077821 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-web" containerID="cri-o://37d1ae52e0af602103a83e031f399bcf36b0d95edcf20570d3561e07c7ea1209" gracePeriod=600 Apr 16 18:21:55.078215 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.077823 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="thanos-sidecar" containerID="cri-o://34878a66061d453a64fa2bfdcbc56128c06bc5d91564df75d24b080746932681" gracePeriod=600 Apr 16 18:21:55.257127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257076 2566 generic.go:358] "Generic (PLEG): container finished" 
podID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerID="5522cb4181894dbbed678c48874a7b6607bc9ecd8b391203a4e336dc985cbd9a" exitCode=0 Apr 16 18:21:55.257127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257107 2566 generic.go:358] "Generic (PLEG): container finished" podID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerID="cc9fd38750bf43cf3a56255115d190a9576bc51c118f4fbdf029576dfc44ef0d" exitCode=0 Apr 16 18:21:55.257127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257124 2566 generic.go:358] "Generic (PLEG): container finished" podID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerID="37d1ae52e0af602103a83e031f399bcf36b0d95edcf20570d3561e07c7ea1209" exitCode=0 Apr 16 18:21:55.257127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257136 2566 generic.go:358] "Generic (PLEG): container finished" podID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerID="34878a66061d453a64fa2bfdcbc56128c06bc5d91564df75d24b080746932681" exitCode=0 Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257144 2566 generic.go:358] "Generic (PLEG): container finished" podID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerID="91aa75d83a8b8b588dd369a1aa2a8399987c635d8740b0d9d1a4323327513f89" exitCode=0 Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257153 2566 generic.go:358] "Generic (PLEG): container finished" podID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerID="9858b22b3f558f508927b04e2e1f738ff1a4cd9f660877d69f0862691437df4f" exitCode=0 Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257150 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"5522cb4181894dbbed678c48874a7b6607bc9ecd8b391203a4e336dc985cbd9a"} Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257196 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"cc9fd38750bf43cf3a56255115d190a9576bc51c118f4fbdf029576dfc44ef0d"} Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257214 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"37d1ae52e0af602103a83e031f399bcf36b0d95edcf20570d3561e07c7ea1209"} Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257229 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"34878a66061d453a64fa2bfdcbc56128c06bc5d91564df75d24b080746932681"} Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257245 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"91aa75d83a8b8b588dd369a1aa2a8399987c635d8740b0d9d1a4323327513f89"} Apr 16 18:21:55.257425 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.257259 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"9858b22b3f558f508927b04e2e1f738ff1a4cd9f660877d69f0862691437df4f"} Apr 16 18:21:55.315345 ip-10-0-136-226 kubenswrapper[2566]: 
I0416 18:21:55.315319 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:55.459069 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459028 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config-out\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459069 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459075 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-grpc-tls\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459098 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459122 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459139 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-serving-certs-ca-bundle\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459164 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459194 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-stn9v\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-kube-api-access-stn9v\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459220 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-thanos-prometheus-http-client-file\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459274 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: 
\"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-tls-assets\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459305 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-metrics-client-ca\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459340 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-tls\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459373 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-web-config\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459405 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-db\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459433 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-rulefiles-0\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459465 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-metrics-client-certs\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459502 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-kubelet-serving-ca-bundle\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459535 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-kube-rbac-proxy\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459546 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-serving-certs-ca-bundle" 
(OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:55.459717 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459565 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-trusted-ca-bundle\") pod \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\" (UID: \"6b0fe202-b329-42dd-bb5c-47bd8a8160ed\") " Apr 16 18:21:55.460191 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.459826 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.460191 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.460152 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:55.461555 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.461084 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:55.461555 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.461363 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:55.461908 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.461882 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config-out" (OuterVolumeSpecName: "config-out") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:21:55.461983 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.461962 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.462067 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.462047 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.462223 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.462186 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config" (OuterVolumeSpecName: "config") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.462460 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.462438 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.462737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.462713 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:55.463433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.463389 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:55.464136 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.464109 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.464221 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.464172 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-kube-api-access-stn9v" (OuterVolumeSpecName: "kube-api-access-stn9v") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "kube-api-access-stn9v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:21:55.464221 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.464180 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.464392 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.464370 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 18:21:55.464986 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.464969 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.465149 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.465131 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.474168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.474148 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-web-config" (OuterVolumeSpecName: "web-config") pod "6b0fe202-b329-42dd-bb5c-47bd8a8160ed" (UID: "6b0fe202-b329-42dd-bb5c-47bd8a8160ed"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 18:21:55.560342 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560321 2566 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config-out\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560342 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560342 2566 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-grpc-tls\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560352 2566 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-config\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560361 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560372 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560382 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-stn9v\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-kube-api-access-stn9v\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560391 2566 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-thanos-prometheus-http-client-file\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560400 2566 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-tls-assets\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560409 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-metrics-client-ca\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560418 2566 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-prometheus-k8s-tls\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560427 2566 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-web-config\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560436 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-db\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560443 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560749 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560452 2566 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-metrics-client-certs\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560749 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560462 2566 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560749 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560472 2566 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-secret-kube-rbac-proxy\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:55.560749 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:55.560481 2566 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6b0fe202-b329-42dd-bb5c-47bd8a8160ed-prometheus-trusted-ca-bundle\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:21:56.263258 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.263219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6b0fe202-b329-42dd-bb5c-47bd8a8160ed","Type":"ContainerDied","Data":"d3348fda51b7eba8f05f6886dcbf32f425f856b178fb3e971c4a154b79b4af20"} Apr 16 18:21:56.263630 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.263277 2566 scope.go:117] "RemoveContainer" containerID="5522cb4181894dbbed678c48874a7b6607bc9ecd8b391203a4e336dc985cbd9a" Apr 16 18:21:56.263630 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.263378 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.270532 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.270507 2566 scope.go:117] "RemoveContainer" containerID="cc9fd38750bf43cf3a56255115d190a9576bc51c118f4fbdf029576dfc44ef0d" Apr 16 18:21:56.277467 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.277449 2566 scope.go:117] "RemoveContainer" containerID="37d1ae52e0af602103a83e031f399bcf36b0d95edcf20570d3561e07c7ea1209" Apr 16 18:21:56.283581 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.283563 2566 scope.go:117] "RemoveContainer" containerID="34878a66061d453a64fa2bfdcbc56128c06bc5d91564df75d24b080746932681" Apr 16 18:21:56.289984 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.289966 2566 scope.go:117] "RemoveContainer" containerID="91aa75d83a8b8b588dd369a1aa2a8399987c635d8740b0d9d1a4323327513f89" Apr 16 18:21:56.290830 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.290807 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:21:56.295649 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.295626 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:21:56.296962 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.296945 2566 scope.go:117] "RemoveContainer" containerID="9858b22b3f558f508927b04e2e1f738ff1a4cd9f660877d69f0862691437df4f" Apr 16 18:21:56.303848 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.303834 2566 scope.go:117] "RemoveContainer" containerID="05bfca08760bed2d7a4caa3682e18b8d73f5cb514b1ecf2cc2f1d231265a3565" Apr 16 18:21:56.324232 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324213 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:21:56.324524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324511 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy" Apr 16 18:21:56.324568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324526 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy" Apr 16 18:21:56.324568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324536 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="prometheus" Apr 16 18:21:56.324568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324541 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="prometheus" Apr 16 18:21:56.324568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324549 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-web" Apr 16 18:21:56.324568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324554 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-web" Apr 16 18:21:56.324568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324566 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="config-reloader" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324571 2566 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="config-reloader" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324579 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="init-config-reloader" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324584 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="init-config-reloader" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324591 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-thanos" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324596 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-thanos" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324604 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="thanos-sidecar" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324609 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="thanos-sidecar" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324652 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="config-reloader" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324659 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-thanos" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324666 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="thanos-sidecar" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324674 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy-web" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324682 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="prometheus" Apr 16 18:21:56.324762 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.324688 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" containerName="kube-rbac-proxy" Apr 16 18:21:56.328601 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.328583 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.331331 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331313 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2mefd2br9knqj\"" Apr 16 18:21:56.331471 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331452 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 18:21:56.331519 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331461 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-985hc\"" Apr 16 18:21:56.331879 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331711 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 18:21:56.331879 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331724 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 18:21:56.331879 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331757 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 18:21:56.331879 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331769 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 18:21:56.331879 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331774 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 18:21:56.331879 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331777 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 18:21:56.331879 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.331778 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 18:21:56.332215 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.332162 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 18:21:56.332269 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.332163 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 18:21:56.332319 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.332164 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 18:21:56.334847 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.334828 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 18:21:56.338135 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.338118 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 18:21:56.341392 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.341374 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:21:56.452009 
ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.451909 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b0fe202-b329-42dd-bb5c-47bd8a8160ed" path="/var/lib/kubelet/pods/6b0fe202-b329-42dd-bb5c-47bd8a8160ed/volumes" Apr 16 18:21:56.467456 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467430 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467584 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467460 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467584 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467483 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467584 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467511 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467584 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467532 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467584 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467550 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467585 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467659 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5726efc-b17a-44d9-9703-efbe9a70152a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467698 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znvbp\" (UniqueName: \"kubernetes.io/projected/f5726efc-b17a-44d9-9703-efbe9a70152a-kube-api-access-znvbp\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467742 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467769 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5726efc-b17a-44d9-9703-efbe9a70152a-config-out\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467790 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-config\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.467805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467806 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.468070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467826 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.468070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467841 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.468070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467854 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.468070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467912 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.468070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.467934 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-web-config\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568790 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568759 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568797 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5726efc-b17a-44d9-9703-efbe9a70152a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568819 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-znvbp\" (UniqueName: \"kubernetes.io/projected/f5726efc-b17a-44d9-9703-efbe9a70152a-kube-api-access-znvbp\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568847 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568865 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f5726efc-b17a-44d9-9703-efbe9a70152a-config-out\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568882 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-config\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568898 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: 
\"kubernetes.io/empty-dir/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568934 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.568972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.568958 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569480 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569519 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569538 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569572 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-web-config\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569610 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569639 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 
18:21:56.569676 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569709 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569733 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569764 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.570168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.569826 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.571706 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.571531 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.572030 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.571849 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.572030 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.571886 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.572264 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.572211 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/f5726efc-b17a-44d9-9703-efbe9a70152a-config-out\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.572720 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.572408 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.572720 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.572682 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f5726efc-b17a-44d9-9703-efbe9a70152a-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.572938 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.572918 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-config\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.573736 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.573678 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.573962 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.573935 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f5726efc-b17a-44d9-9703-efbe9a70152a-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.574123 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.573935 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.574581 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.574555 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.574581 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.574574 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.574722 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.574620 2566 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.574984 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.574960 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.575205 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.575188 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f5726efc-b17a-44d9-9703-efbe9a70152a-web-config\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.578198 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.578177 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-znvbp\" (UniqueName: \"kubernetes.io/projected/f5726efc-b17a-44d9-9703-efbe9a70152a-kube-api-access-znvbp\") pod \"prometheus-k8s-0\" (UID: \"f5726efc-b17a-44d9-9703-efbe9a70152a\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.638739 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.638706 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:21:56.766908 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:56.766882 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 18:21:56.768575 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:21:56.768543 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5726efc_b17a_44d9_9703_efbe9a70152a.slice/crio-cf81fb29d0ca3215b85530aadd64caa2873549fa55d615667df312b8a369d373 WatchSource:0}: Error finding container cf81fb29d0ca3215b85530aadd64caa2873549fa55d615667df312b8a369d373: Status 404 returned error can't find the container with id cf81fb29d0ca3215b85530aadd64caa2873549fa55d615667df312b8a369d373 Apr 16 18:21:57.270537 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:57.270499 2566 generic.go:358] "Generic (PLEG): container finished" podID="f5726efc-b17a-44d9-9703-efbe9a70152a" containerID="9c058b0f03db69378c17d732b1cca9b68053c88445f5a6ad6095823876a8c4cc" exitCode=0 Apr 16 18:21:57.270934 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:57.270590 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerDied","Data":"9c058b0f03db69378c17d732b1cca9b68053c88445f5a6ad6095823876a8c4cc"} Apr 16 18:21:57.270934 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:57.270634 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerStarted","Data":"cf81fb29d0ca3215b85530aadd64caa2873549fa55d615667df312b8a369d373"} Apr 16 18:21:58.277689 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:58.277650 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerStarted","Data":"f61526aa650029eb99797381d4c117f304c5cf6d39d25047d10e4bb795469837"} Apr 16 18:21:58.277689 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:58.277696 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerStarted","Data":"0a58ec92812a42370abdb8a8bdcdea1801f5d9395323a93741c4528ce0b1ad8f"} Apr 16 18:21:58.278251 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:58.277712 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerStarted","Data":"6db46504343e01f9afc32a39996ba68dc4c7a20c92b39ed1ca2c974830dbf3dd"} Apr 16 18:21:58.278251 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:58.277725 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerStarted","Data":"e2a77a7d9791c163e8ea79ea5acd26cb0ccaf05590475149d58e185a9b66ec42"} Apr 16 18:21:58.278251 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:58.277737 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerStarted","Data":"f1325db94c3d57149317d0a4f70f09e3041ef0783b8c533e0a654d716b45397f"} Apr 16 18:21:58.278251 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:58.277750 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f5726efc-b17a-44d9-9703-efbe9a70152a","Type":"ContainerStarted","Data":"4e803abe50d9bf0076aa0305cfcd6cb641fe50b35a2c840a0dabce3e023c3d4a"} Apr 16 18:21:58.310298 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:58.310251 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.310237502 podStartE2EDuration="2.310237502s" podCreationTimestamp="2026-04-16 18:21:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:21:58.307616709 +0000 UTC m=+294.399904674" watchObservedRunningTime="2026-04-16 18:21:58.310237502 +0000 UTC m=+294.402525467" Apr 16 18:21:59.241864 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:21:59.241834 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2zc2n" Apr 16 18:22:01.639216 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:22:01.639190 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:04.295196 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:22:04.295169 2566 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 18:22:56.639807 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:22:56.639697 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:56.655104 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:22:56.655078 2566 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 18:22:57.460702 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:22:57.460677 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
18:27:30.997345 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:30.997269 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tmms8"] Apr 16 18:27:30.999428 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:30.999412 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.001966 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.001937 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 18:27:31.002988 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.002962 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 18:27:31.003128 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.003043 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-zsgdg\"" Apr 16 18:27:31.003128 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.003077 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 18:27:31.011032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.011010 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tmms8"] Apr 16 18:27:31.020703 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.020678 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-sxf6f"] Apr 16 18:27:31.023305 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.023286 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.025975 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.025954 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bqs8m\"" Apr 16 18:27:31.026455 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.026439 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 18:27:31.035535 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.035514 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sxf6f"] Apr 16 18:27:31.074797 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.074762 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vvsq\" (UniqueName: \"kubernetes.io/projected/95a3dbb4-48a9-4b3e-9073-dd14bb5891f7-kube-api-access-8vvsq\") pod \"seaweedfs-86cc847c5c-sxf6f\" (UID: \"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7\") " pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.074797 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.074796 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/95a3dbb4-48a9-4b3e-9073-dd14bb5891f7-data\") pod \"seaweedfs-86cc847c5c-sxf6f\" (UID: \"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7\") " pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.175394 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.175363 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8vvsq\" (UniqueName: \"kubernetes.io/projected/95a3dbb4-48a9-4b3e-9073-dd14bb5891f7-kube-api-access-8vvsq\") pod \"seaweedfs-86cc847c5c-sxf6f\" (UID: 
\"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7\") " pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.175394 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.175397 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/95a3dbb4-48a9-4b3e-9073-dd14bb5891f7-data\") pod \"seaweedfs-86cc847c5c-sxf6f\" (UID: \"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7\") " pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.175590 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.175430 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9sh\" (UniqueName: \"kubernetes.io/projected/08efaeba-fdcb-44d5-bec2-39a299e6eb3d-kube-api-access-kt9sh\") pod \"llmisvc-controller-manager-68cc5db7c4-tmms8\" (UID: \"08efaeba-fdcb-44d5-bec2-39a299e6eb3d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.175590 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.175464 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08efaeba-fdcb-44d5-bec2-39a299e6eb3d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tmms8\" (UID: \"08efaeba-fdcb-44d5-bec2-39a299e6eb3d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.175760 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.175743 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/95a3dbb4-48a9-4b3e-9073-dd14bb5891f7-data\") pod \"seaweedfs-86cc847c5c-sxf6f\" (UID: \"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7\") " pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.185080 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.185059 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vvsq\" (UniqueName: \"kubernetes.io/projected/95a3dbb4-48a9-4b3e-9073-dd14bb5891f7-kube-api-access-8vvsq\") pod \"seaweedfs-86cc847c5c-sxf6f\" (UID: \"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7\") " pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.276771 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.276692 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9sh\" (UniqueName: \"kubernetes.io/projected/08efaeba-fdcb-44d5-bec2-39a299e6eb3d-kube-api-access-kt9sh\") pod \"llmisvc-controller-manager-68cc5db7c4-tmms8\" (UID: \"08efaeba-fdcb-44d5-bec2-39a299e6eb3d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.276771 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.276731 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08efaeba-fdcb-44d5-bec2-39a299e6eb3d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tmms8\" (UID: \"08efaeba-fdcb-44d5-bec2-39a299e6eb3d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.278979 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.278956 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/08efaeba-fdcb-44d5-bec2-39a299e6eb3d-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-tmms8\" (UID: \"08efaeba-fdcb-44d5-bec2-39a299e6eb3d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.286133 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.286112 2566 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-kt9sh\" (UniqueName: \"kubernetes.io/projected/08efaeba-fdcb-44d5-bec2-39a299e6eb3d-kube-api-access-kt9sh\") pod \"llmisvc-controller-manager-68cc5db7c4-tmms8\" (UID: \"08efaeba-fdcb-44d5-bec2-39a299e6eb3d\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.310946 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.310929 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:31.333732 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.333707 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:27:31.444124 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.443966 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-tmms8"] Apr 16 18:27:31.446822 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:27:31.446797 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod08efaeba_fdcb_44d5_bec2_39a299e6eb3d.slice/crio-61e6e00bceb866a688358872b43badd284f9e7059a25e51fd64beef907372cd3 WatchSource:0}: Error finding container 61e6e00bceb866a688358872b43badd284f9e7059a25e51fd64beef907372cd3: Status 404 returned error can't find the container with id 61e6e00bceb866a688358872b43badd284f9e7059a25e51fd64beef907372cd3 Apr 16 18:27:31.448174 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.448153 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:27:31.465046 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:31.465017 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-sxf6f"] Apr 16 18:27:31.467519 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:27:31.467495 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95a3dbb4_48a9_4b3e_9073_dd14bb5891f7.slice/crio-a481c03d3b10499afa3a3f5ce5b0ebdf4922c216805896d97cf3607087da1420 WatchSource:0}: Error finding container a481c03d3b10499afa3a3f5ce5b0ebdf4922c216805896d97cf3607087da1420: Status 404 returned error can't find the container with id a481c03d3b10499afa3a3f5ce5b0ebdf4922c216805896d97cf3607087da1420 Apr 16 18:27:32.217068 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:32.217029 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sxf6f" event={"ID":"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7","Type":"ContainerStarted","Data":"a481c03d3b10499afa3a3f5ce5b0ebdf4922c216805896d97cf3607087da1420"} Apr 16 18:27:32.218495 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:32.218454 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" event={"ID":"08efaeba-fdcb-44d5-bec2-39a299e6eb3d","Type":"ContainerStarted","Data":"61e6e00bceb866a688358872b43badd284f9e7059a25e51fd64beef907372cd3"} Apr 16 18:27:35.228857 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:35.228818 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-sxf6f" event={"ID":"95a3dbb4-48a9-4b3e-9073-dd14bb5891f7","Type":"ContainerStarted","Data":"0d6f07ae61c34722884cd0df558e768badbd1b7e7434632bf9140fd881064bd0"} Apr 16 18:27:35.229324 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:35.228889 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 
16 18:27:35.230294 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:35.230260 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" event={"ID":"08efaeba-fdcb-44d5-bec2-39a299e6eb3d","Type":"ContainerStarted","Data":"bf80f4723226d4cc5c01f480c374462964084335276b970c9209998d10f2d2f2"} Apr 16 18:27:35.230576 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:35.230559 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:27:35.249043 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:35.249004 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-sxf6f" podStartSLOduration=2.034504958 podStartE2EDuration="5.248977042s" podCreationTimestamp="2026-04-16 18:27:30 +0000 UTC" firstStartedPulling="2026-04-16 18:27:31.468770696 +0000 UTC m=+627.561058639" lastFinishedPulling="2026-04-16 18:27:34.683242776 +0000 UTC m=+630.775530723" observedRunningTime="2026-04-16 18:27:35.247383916 +0000 UTC m=+631.339671882" watchObservedRunningTime="2026-04-16 18:27:35.248977042 +0000 UTC m=+631.341265007" Apr 16 18:27:35.265805 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:35.265767 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" podStartSLOduration=2.084546873 podStartE2EDuration="5.265754844s" podCreationTimestamp="2026-04-16 18:27:30 +0000 UTC" firstStartedPulling="2026-04-16 18:27:31.448287105 +0000 UTC m=+627.540575047" lastFinishedPulling="2026-04-16 18:27:34.629495062 +0000 UTC m=+630.721783018" observedRunningTime="2026-04-16 18:27:35.264638119 +0000 UTC m=+631.356926083" watchObservedRunningTime="2026-04-16 18:27:35.265754844 +0000 UTC m=+631.358042808" Apr 16 18:27:41.235674 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:27:41.235642 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-sxf6f" Apr 16 18:28:06.235623 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:06.235590 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-tmms8" Apr 16 18:28:41.423123 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.423094 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-pmh5s"] Apr 16 18:28:41.426527 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.426506 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:41.429176 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.429156 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 18:28:41.429279 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.429162 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-m6xp2\"" Apr 16 18:28:41.436101 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.436079 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pmh5s"] Apr 16 18:28:41.547334 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.547303 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f661a39a-8433-4d1d-9c24-aea36bb0c831-cert\") pod \"odh-model-controller-696fc77849-pmh5s\" (UID: \"f661a39a-8433-4d1d-9c24-aea36bb0c831\") " pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:41.547485 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.547350 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxvgj\" (UniqueName: \"kubernetes.io/projected/f661a39a-8433-4d1d-9c24-aea36bb0c831-kube-api-access-nxvgj\") pod \"odh-model-controller-696fc77849-pmh5s\" (UID: \"f661a39a-8433-4d1d-9c24-aea36bb0c831\") " pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:41.647753 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.647718 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f661a39a-8433-4d1d-9c24-aea36bb0c831-cert\") pod \"odh-model-controller-696fc77849-pmh5s\" (UID: \"f661a39a-8433-4d1d-9c24-aea36bb0c831\") " pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:41.647900 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.647766 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nxvgj\" (UniqueName: \"kubernetes.io/projected/f661a39a-8433-4d1d-9c24-aea36bb0c831-kube-api-access-nxvgj\") pod \"odh-model-controller-696fc77849-pmh5s\" (UID: \"f661a39a-8433-4d1d-9c24-aea36bb0c831\") " pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:41.647945 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:28:41.647893 2566 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 18:28:41.647984 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:28:41.647967 2566 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f661a39a-8433-4d1d-9c24-aea36bb0c831-cert podName:f661a39a-8433-4d1d-9c24-aea36bb0c831 nodeName:}" failed. No retries permitted until 2026-04-16 18:28:42.147949087 +0000 UTC m=+698.240237030 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f661a39a-8433-4d1d-9c24-aea36bb0c831-cert") pod "odh-model-controller-696fc77849-pmh5s" (UID: "f661a39a-8433-4d1d-9c24-aea36bb0c831") : secret "odh-model-controller-webhook-cert" not found Apr 16 18:28:41.659713 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:41.659685 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxvgj\" (UniqueName: \"kubernetes.io/projected/f661a39a-8433-4d1d-9c24-aea36bb0c831-kube-api-access-nxvgj\") pod \"odh-model-controller-696fc77849-pmh5s\" (UID: \"f661a39a-8433-4d1d-9c24-aea36bb0c831\") " pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:42.152360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:42.152323 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f661a39a-8433-4d1d-9c24-aea36bb0c831-cert\") pod \"odh-model-controller-696fc77849-pmh5s\" (UID: \"f661a39a-8433-4d1d-9c24-aea36bb0c831\") " pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:42.154653 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:42.154625 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f661a39a-8433-4d1d-9c24-aea36bb0c831-cert\") pod \"odh-model-controller-696fc77849-pmh5s\" (UID: \"f661a39a-8433-4d1d-9c24-aea36bb0c831\") " pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:42.337118 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:42.337081 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:42.452807 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:42.452787 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-pmh5s"] Apr 16 18:28:42.455364 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:28:42.455340 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf661a39a_8433_4d1d_9c24_aea36bb0c831.slice/crio-ef64a5074379680fb7e3ca07c266b4f850d518b47e7fa6fd9feef280cd76595b WatchSource:0}: Error finding container ef64a5074379680fb7e3ca07c266b4f850d518b47e7fa6fd9feef280cd76595b: Status 404 returned error can't find the container with id ef64a5074379680fb7e3ca07c266b4f850d518b47e7fa6fd9feef280cd76595b Apr 16 18:28:43.420876 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:43.420819 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pmh5s" event={"ID":"f661a39a-8433-4d1d-9c24-aea36bb0c831","Type":"ContainerStarted","Data":"ef64a5074379680fb7e3ca07c266b4f850d518b47e7fa6fd9feef280cd76595b"} Apr 16 18:28:45.428017 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:45.427961 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-pmh5s" event={"ID":"f661a39a-8433-4d1d-9c24-aea36bb0c831","Type":"ContainerStarted","Data":"36d465fd4de8962734940203ad012e6104f63b37797bae523a4c5fff9de45219"} Apr 16 18:28:45.428358 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:45.428103 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:45.448146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:45.448103 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-pmh5s" 
podStartSLOduration=1.687820947 podStartE2EDuration="4.448090945s" podCreationTimestamp="2026-04-16 18:28:41 +0000 UTC" firstStartedPulling="2026-04-16 18:28:42.456658882 +0000 UTC m=+698.548946829" lastFinishedPulling="2026-04-16 18:28:45.21692887 +0000 UTC m=+701.309216827" observedRunningTime="2026-04-16 18:28:45.446288759 +0000 UTC m=+701.538576725" watchObservedRunningTime="2026-04-16 18:28:45.448090945 +0000 UTC m=+701.540378925" Apr 16 18:28:56.432895 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:56.432821 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-pmh5s" Apr 16 18:28:57.276120 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.276089 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-hdzgn"] Apr 16 18:28:57.279268 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.279254 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hdzgn" Apr 16 18:28:57.284194 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.284170 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hdzgn"] Apr 16 18:28:57.385676 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.385645 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdbng\" (UniqueName: \"kubernetes.io/projected/5d51deaf-b55f-4111-afb9-e258cd821a00-kube-api-access-hdbng\") pod \"s3-init-hdzgn\" (UID: \"5d51deaf-b55f-4111-afb9-e258cd821a00\") " pod="kserve/s3-init-hdzgn" Apr 16 18:28:57.486680 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.486651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdbng\" (UniqueName: \"kubernetes.io/projected/5d51deaf-b55f-4111-afb9-e258cd821a00-kube-api-access-hdbng\") pod \"s3-init-hdzgn\" (UID: \"5d51deaf-b55f-4111-afb9-e258cd821a00\") " pod="kserve/s3-init-hdzgn" Apr 16 18:28:57.496483 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.496460 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdbng\" (UniqueName: \"kubernetes.io/projected/5d51deaf-b55f-4111-afb9-e258cd821a00-kube-api-access-hdbng\") pod \"s3-init-hdzgn\" (UID: \"5d51deaf-b55f-4111-afb9-e258cd821a00\") " pod="kserve/s3-init-hdzgn" Apr 16 18:28:57.599289 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.599261 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-hdzgn" Apr 16 18:28:57.721355 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:57.721330 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-hdzgn"] Apr 16 18:28:57.723699 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:28:57.723669 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d51deaf_b55f_4111_afb9_e258cd821a00.slice/crio-8cb349b01d5e5e2d64796ca5aa47fbbaeca05e4c460ec8b43bdf3b0b7fcf816e WatchSource:0}: Error finding container 8cb349b01d5e5e2d64796ca5aa47fbbaeca05e4c460ec8b43bdf3b0b7fcf816e: Status 404 returned error can't find the container with id 8cb349b01d5e5e2d64796ca5aa47fbbaeca05e4c460ec8b43bdf3b0b7fcf816e Apr 16 18:28:58.465415 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:28:58.465372 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hdzgn" event={"ID":"5d51deaf-b55f-4111-afb9-e258cd821a00","Type":"ContainerStarted","Data":"8cb349b01d5e5e2d64796ca5aa47fbbaeca05e4c460ec8b43bdf3b0b7fcf816e"} Apr 16 18:29:02.480313 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:02.480237 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hdzgn" event={"ID":"5d51deaf-b55f-4111-afb9-e258cd821a00","Type":"ContainerStarted","Data":"23d7c1baddd73c54290a3f5476fb2ff4113e1644b6c662aab6dd785dc4823126"} Apr 16 18:29:02.498231 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:02.498187 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-hdzgn" podStartSLOduration=1.113108588 podStartE2EDuration="5.49817621s" podCreationTimestamp="2026-04-16 18:28:57 +0000 UTC" firstStartedPulling="2026-04-16 18:28:57.725436149 +0000 UTC m=+713.817724092" lastFinishedPulling="2026-04-16 18:29:02.110503756 +0000 UTC m=+718.202791714" observedRunningTime="2026-04-16 18:29:02.497076777 +0000 UTC m=+718.589364745" watchObservedRunningTime="2026-04-16 18:29:02.49817621 +0000 UTC m=+718.590464174" Apr 16 18:29:05.490115 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:05.490025 2566 generic.go:358] "Generic (PLEG): container finished" podID="5d51deaf-b55f-4111-afb9-e258cd821a00" containerID="23d7c1baddd73c54290a3f5476fb2ff4113e1644b6c662aab6dd785dc4823126" exitCode=0 Apr 16 18:29:05.490527 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:05.490106 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hdzgn" event={"ID":"5d51deaf-b55f-4111-afb9-e258cd821a00","Type":"ContainerDied","Data":"23d7c1baddd73c54290a3f5476fb2ff4113e1644b6c662aab6dd785dc4823126"} Apr 16 18:29:06.614821 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:06.614798 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-hdzgn" Apr 16 18:29:06.773582 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:06.773498 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdbng\" (UniqueName: \"kubernetes.io/projected/5d51deaf-b55f-4111-afb9-e258cd821a00-kube-api-access-hdbng\") pod \"5d51deaf-b55f-4111-afb9-e258cd821a00\" (UID: \"5d51deaf-b55f-4111-afb9-e258cd821a00\") " Apr 16 18:29:06.775577 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:06.775553 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d51deaf-b55f-4111-afb9-e258cd821a00-kube-api-access-hdbng" (OuterVolumeSpecName: "kube-api-access-hdbng") pod "5d51deaf-b55f-4111-afb9-e258cd821a00" (UID: "5d51deaf-b55f-4111-afb9-e258cd821a00"). InnerVolumeSpecName "kube-api-access-hdbng". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:29:06.874649 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:06.874615 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hdbng\" (UniqueName: \"kubernetes.io/projected/5d51deaf-b55f-4111-afb9-e258cd821a00-kube-api-access-hdbng\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:29:07.496179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:07.496149 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-hdzgn" event={"ID":"5d51deaf-b55f-4111-afb9-e258cd821a00","Type":"ContainerDied","Data":"8cb349b01d5e5e2d64796ca5aa47fbbaeca05e4c460ec8b43bdf3b0b7fcf816e"} Apr 16 18:29:07.496179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:07.496179 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cb349b01d5e5e2d64796ca5aa47fbbaeca05e4c460ec8b43bdf3b0b7fcf816e" Apr 16 18:29:07.496388 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:07.496186 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-hdzgn" Apr 16 18:29:47.171829 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.171789 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-custom-44qth"] Apr 16 18:29:47.172358 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.172209 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5d51deaf-b55f-4111-afb9-e258cd821a00" containerName="s3-init" Apr 16 18:29:47.172358 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.172225 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d51deaf-b55f-4111-afb9-e258cd821a00" containerName="s3-init" Apr 16 18:29:47.172358 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.172284 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5d51deaf-b55f-4111-afb9-e258cd821a00" containerName="s3-init" Apr 16 18:29:47.174356 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.174337 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-44qth" Apr 16 18:29:47.177099 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.177080 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-custom-artifact\"" Apr 16 18:29:47.183882 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.183860 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-44qth"] Apr 16 18:29:47.327954 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.327919 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mghb9\" (UniqueName: \"kubernetes.io/projected/fd984710-2bda-435c-862c-c3bbe07c161f-kube-api-access-mghb9\") pod \"s3-tls-init-custom-44qth\" (UID: \"fd984710-2bda-435c-862c-c3bbe07c161f\") " pod="kserve/s3-tls-init-custom-44qth" Apr 16 18:29:47.429400 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.429305 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mghb9\" (UniqueName: \"kubernetes.io/projected/fd984710-2bda-435c-862c-c3bbe07c161f-kube-api-access-mghb9\") pod \"s3-tls-init-custom-44qth\" (UID: \"fd984710-2bda-435c-862c-c3bbe07c161f\") " pod="kserve/s3-tls-init-custom-44qth" Apr 16 18:29:47.439325 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.439297 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mghb9\" (UniqueName: \"kubernetes.io/projected/fd984710-2bda-435c-862c-c3bbe07c161f-kube-api-access-mghb9\") pod \"s3-tls-init-custom-44qth\" (UID: \"fd984710-2bda-435c-862c-c3bbe07c161f\") " pod="kserve/s3-tls-init-custom-44qth" Apr 16 18:29:47.497744 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.497712 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-44qth" Apr 16 18:29:47.613987 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:47.613964 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-custom-44qth"] Apr 16 18:29:47.616286 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:29:47.616251 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd984710_2bda_435c_862c_c3bbe07c161f.slice/crio-80c6fc0b32ebe1bcdad790912e74327fae3e0a5ec4b4f6a56ebad407224caeeb WatchSource:0}: Error finding container 80c6fc0b32ebe1bcdad790912e74327fae3e0a5ec4b4f6a56ebad407224caeeb: Status 404 returned error can't find the container with id 80c6fc0b32ebe1bcdad790912e74327fae3e0a5ec4b4f6a56ebad407224caeeb Apr 16 18:29:48.618345 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:48.618316 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-44qth" event={"ID":"fd984710-2bda-435c-862c-c3bbe07c161f","Type":"ContainerStarted","Data":"91ed86bec9f489ca883ba7a44d29856238772f470e95f0451dc8bec3b7e72471"} Apr 16 18:29:48.618345 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:48.618347 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-44qth" event={"ID":"fd984710-2bda-435c-862c-c3bbe07c161f","Type":"ContainerStarted","Data":"80c6fc0b32ebe1bcdad790912e74327fae3e0a5ec4b4f6a56ebad407224caeeb"} Apr 16 18:29:48.641151 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:48.637674 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-custom-44qth" podStartSLOduration=1.637658491 podStartE2EDuration="1.637658491s" podCreationTimestamp="2026-04-16 18:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:29:48.635344115 +0000 UTC m=+764.727632083" watchObservedRunningTime="2026-04-16 18:29:48.637658491 +0000 UTC m=+764.729946457" Apr 16 18:29:51.628750 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:51.628720 2566 generic.go:358] "Generic (PLEG): container finished" podID="fd984710-2bda-435c-862c-c3bbe07c161f" containerID="91ed86bec9f489ca883ba7a44d29856238772f470e95f0451dc8bec3b7e72471" exitCode=0 Apr 16 18:29:51.629130 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:51.628795 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-44qth" event={"ID":"fd984710-2bda-435c-862c-c3bbe07c161f","Type":"ContainerDied","Data":"91ed86bec9f489ca883ba7a44d29856238772f470e95f0451dc8bec3b7e72471"} Apr 16 18:29:52.752555 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:52.752533 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-custom-44qth" Apr 16 18:29:52.776926 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:52.776904 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mghb9\" (UniqueName: \"kubernetes.io/projected/fd984710-2bda-435c-862c-c3bbe07c161f-kube-api-access-mghb9\") pod \"fd984710-2bda-435c-862c-c3bbe07c161f\" (UID: \"fd984710-2bda-435c-862c-c3bbe07c161f\") " Apr 16 18:29:52.778896 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:52.778870 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd984710-2bda-435c-862c-c3bbe07c161f-kube-api-access-mghb9" (OuterVolumeSpecName: "kube-api-access-mghb9") pod "fd984710-2bda-435c-862c-c3bbe07c161f" (UID: "fd984710-2bda-435c-862c-c3bbe07c161f"). InnerVolumeSpecName "kube-api-access-mghb9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:29:52.878067 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:52.878029 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mghb9\" (UniqueName: \"kubernetes.io/projected/fd984710-2bda-435c-862c-c3bbe07c161f-kube-api-access-mghb9\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:29:53.635170 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:53.635140 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-custom-44qth" Apr 16 18:29:53.635337 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:53.635138 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-custom-44qth" event={"ID":"fd984710-2bda-435c-862c-c3bbe07c161f","Type":"ContainerDied","Data":"80c6fc0b32ebe1bcdad790912e74327fae3e0a5ec4b4f6a56ebad407224caeeb"} Apr 16 18:29:53.635337 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:53.635244 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80c6fc0b32ebe1bcdad790912e74327fae3e0a5ec4b4f6a56ebad407224caeeb" Apr 16 18:29:56.375963 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.375927 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-tls-init-serving-vlrxn"] Apr 16 18:29:56.376343 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.376323 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fd984710-2bda-435c-862c-c3bbe07c161f" containerName="s3-tls-init-custom" Apr 16 18:29:56.376343 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.376339 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd984710-2bda-435c-862c-c3bbe07c161f" containerName="s3-tls-init-custom" Apr 16 18:29:56.376432 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.376407 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="fd984710-2bda-435c-862c-c3bbe07c161f" containerName="s3-tls-init-custom" Apr 16 18:29:56.378208 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.378194 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-vlrxn" Apr 16 18:29:56.380812 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.380790 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"seaweedfs-tls-serving-artifact\"" Apr 16 18:29:56.386761 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.386739 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-vlrxn"] Apr 16 18:29:56.406294 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.406268 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dtcr\" (UniqueName: \"kubernetes.io/projected/527f11d7-de15-47e3-a52b-b37ec6de8cca-kube-api-access-5dtcr\") pod \"s3-tls-init-serving-vlrxn\" (UID: \"527f11d7-de15-47e3-a52b-b37ec6de8cca\") " pod="kserve/s3-tls-init-serving-vlrxn" Apr 16 18:29:56.507263 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.507232 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dtcr\" (UniqueName: \"kubernetes.io/projected/527f11d7-de15-47e3-a52b-b37ec6de8cca-kube-api-access-5dtcr\") pod \"s3-tls-init-serving-vlrxn\" (UID: \"527f11d7-de15-47e3-a52b-b37ec6de8cca\") " pod="kserve/s3-tls-init-serving-vlrxn" Apr 16 18:29:56.517091 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.517064 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dtcr\" (UniqueName: \"kubernetes.io/projected/527f11d7-de15-47e3-a52b-b37ec6de8cca-kube-api-access-5dtcr\") pod \"s3-tls-init-serving-vlrxn\" (UID: \"527f11d7-de15-47e3-a52b-b37ec6de8cca\") " pod="kserve/s3-tls-init-serving-vlrxn" Apr 16 18:29:56.700674 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.700588 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-vlrxn" Apr 16 18:29:56.820013 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:56.819976 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-tls-init-serving-vlrxn"] Apr 16 18:29:56.822051 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:29:56.822024 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod527f11d7_de15_47e3_a52b_b37ec6de8cca.slice/crio-016575b7ca32999af795141d9a8929941355b25933477c25946c69a19e3c3399 WatchSource:0}: Error finding container 016575b7ca32999af795141d9a8929941355b25933477c25946c69a19e3c3399: Status 404 returned error can't find the container with id 016575b7ca32999af795141d9a8929941355b25933477c25946c69a19e3c3399 Apr 16 18:29:57.648322 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:57.648288 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-vlrxn" event={"ID":"527f11d7-de15-47e3-a52b-b37ec6de8cca","Type":"ContainerStarted","Data":"946f17b8172e87f2d7ccee68951b780d96e318b267da95e1a3374b55b90458b7"} Apr 16 18:29:57.648322 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:57.648323 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-vlrxn" event={"ID":"527f11d7-de15-47e3-a52b-b37ec6de8cca","Type":"ContainerStarted","Data":"016575b7ca32999af795141d9a8929941355b25933477c25946c69a19e3c3399"} Apr 16 18:29:57.666815 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:29:57.666770 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-tls-init-serving-vlrxn" podStartSLOduration=1.666755727 podStartE2EDuration="1.666755727s" podCreationTimestamp="2026-04-16 18:29:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:29:57.664144816 +0000 UTC m=+773.756432778" watchObservedRunningTime="2026-04-16 18:29:57.666755727 +0000 UTC m=+773.759043692" Apr 16 18:30:02.664238 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:02.664199 2566 generic.go:358] "Generic (PLEG): container finished" podID="527f11d7-de15-47e3-a52b-b37ec6de8cca" containerID="946f17b8172e87f2d7ccee68951b780d96e318b267da95e1a3374b55b90458b7" exitCode=0 Apr 16 18:30:02.664624 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:02.664283 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-vlrxn" event={"ID":"527f11d7-de15-47e3-a52b-b37ec6de8cca","Type":"ContainerDied","Data":"946f17b8172e87f2d7ccee68951b780d96e318b267da95e1a3374b55b90458b7"} Apr 16 18:30:03.789273 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:03.789247 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-tls-init-serving-vlrxn" Apr 16 18:30:03.869396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:03.869366 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dtcr\" (UniqueName: \"kubernetes.io/projected/527f11d7-de15-47e3-a52b-b37ec6de8cca-kube-api-access-5dtcr\") pod \"527f11d7-de15-47e3-a52b-b37ec6de8cca\" (UID: \"527f11d7-de15-47e3-a52b-b37ec6de8cca\") " Apr 16 18:30:03.871338 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:03.871315 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527f11d7-de15-47e3-a52b-b37ec6de8cca-kube-api-access-5dtcr" (OuterVolumeSpecName: "kube-api-access-5dtcr") pod "527f11d7-de15-47e3-a52b-b37ec6de8cca" (UID: "527f11d7-de15-47e3-a52b-b37ec6de8cca"). InnerVolumeSpecName "kube-api-access-5dtcr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 18:30:03.970489 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:03.970417 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5dtcr\" (UniqueName: \"kubernetes.io/projected/527f11d7-de15-47e3-a52b-b37ec6de8cca-kube-api-access-5dtcr\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:30:04.670788 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:04.670755 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-tls-init-serving-vlrxn" Apr 16 18:30:04.670788 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:04.670778 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-tls-init-serving-vlrxn" event={"ID":"527f11d7-de15-47e3-a52b-b37ec6de8cca","Type":"ContainerDied","Data":"016575b7ca32999af795141d9a8929941355b25933477c25946c69a19e3c3399"} Apr 16 18:30:04.670983 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:04.670803 2566 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="016575b7ca32999af795141d9a8929941355b25933477c25946c69a19e3c3399" Apr 16 18:30:12.088102 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.088062 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"] Apr 16 18:30:12.088576 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.088556 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="527f11d7-de15-47e3-a52b-b37ec6de8cca" containerName="s3-tls-init-serving" Apr 16 18:30:12.088647 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.088580 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="527f11d7-de15-47e3-a52b-b37ec6de8cca" containerName="s3-tls-init-serving" Apr 16 18:30:12.088713 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.088663 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="527f11d7-de15-47e3-a52b-b37ec6de8cca" containerName="s3-tls-init-serving" Apr 16 18:30:12.091073 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.091050 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:30:12.093583 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.093562 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tdsxq\"" Apr 16 18:30:12.102552 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.102529 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"] Apr 16 18:30:12.142606 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.142574 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e21c0c62-9da4-45fa-9593-be3ef2e83ce8-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq\" (UID: \"e21c0c62-9da4-45fa-9593-be3ef2e83ce8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:30:12.243526 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.243496 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e21c0c62-9da4-45fa-9593-be3ef2e83ce8-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq\" (UID: \"e21c0c62-9da4-45fa-9593-be3ef2e83ce8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:30:12.243886 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.243865 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e21c0c62-9da4-45fa-9593-be3ef2e83ce8-kserve-provision-location\") pod \"isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq\" (UID: \"e21c0c62-9da4-45fa-9593-be3ef2e83ce8\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:30:12.400943 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.400862 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:30:12.526645 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.526622 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"] Apr 16 18:30:12.528499 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:30:12.528467 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode21c0c62_9da4_45fa_9593_be3ef2e83ce8.slice/crio-17affb3314befa5d7fbb92b635be98d7d85efabcac711a0c26f560c98d13cc3e WatchSource:0}: Error finding container 17affb3314befa5d7fbb92b635be98d7d85efabcac711a0c26f560c98d13cc3e: Status 404 returned error can't find the container with id 17affb3314befa5d7fbb92b635be98d7d85efabcac711a0c26f560c98d13cc3e Apr 16 18:30:12.700343 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:12.700252 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerStarted","Data":"17affb3314befa5d7fbb92b635be98d7d85efabcac711a0c26f560c98d13cc3e"} Apr 16 18:30:16.715434 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:16.715399 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerStarted","Data":"eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3"} Apr 16 18:30:20.729822 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:20.729787 2566 generic.go:358] "Generic (PLEG): container finished" podID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerID="eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3" exitCode=0 Apr 16 18:30:20.729822 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:20.729826 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerDied","Data":"eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3"} Apr 16 18:30:34.780046 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:34.780013 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerStarted","Data":"405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6"} Apr 16 18:30:37.791195 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:37.791154 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerStarted","Data":"9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139"} Apr 16 18:30:37.791570 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:37.791317 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:30:37.791570 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:37.791449 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:30:37.792817 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:37.792760 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:30:37.793389 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:37.793363 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:37.809083 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:37.808974 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podStartSLOduration=1.232812252 podStartE2EDuration="25.808961107s" podCreationTimestamp="2026-04-16 18:30:12 +0000 UTC" firstStartedPulling="2026-04-16 18:30:12.530385895 +0000 UTC m=+788.622673837" lastFinishedPulling="2026-04-16 18:30:37.106534746 +0000 UTC m=+813.198822692" observedRunningTime="2026-04-16 18:30:37.808243752 +0000 UTC m=+813.900531711" watchObservedRunningTime="2026-04-16 18:30:37.808961107 +0000 UTC m=+813.901249073" Apr 16 18:30:38.794159 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:38.794118 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:30:38.794589 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:38.794389 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:48.794743 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:48.794696 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:30:48.795235 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:48.795106 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:30:58.794148 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:58.794105 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:30:58.794629 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:30:58.794523 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:08.794601 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:08.794549 2566 prober.go:120] "Probe 
failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:31:08.795061 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:08.795038 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:18.794586 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:18.794533 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:31:18.795063 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:18.795038 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:28.794400 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:28.794354 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:31:28.794856 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:28.794831 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:38.794797 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:38.794747 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused" Apr 16 18:31:38.795252 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:38.795153 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:31:48.794818 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:48.794774 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:31:48.795366 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:48.795212 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" Apr 16 18:31:57.259752 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.255707 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"] Apr 16 18:31:57.260899 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.260828 2566 
Apr 16 18:31:57.261090 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.260909 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" containerID="cri-o://9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139" gracePeriod=30
Apr 16 18:31:57.333330 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.333302 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"]
Apr 16 18:31:57.335957 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.335938 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"
Apr 16 18:31:57.345413 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.345389 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"]
Apr 16 18:31:57.439524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.439485 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a29a5a0f-1bd0-4e47-845b-87f8f98a5c95-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-746b56d97c-np249\" (UID: \"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"
Apr 16 18:31:57.540394 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.540306 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a29a5a0f-1bd0-4e47-845b-87f8f98a5c95-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-746b56d97c-np249\" (UID: \"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"
Apr 16 18:31:57.540694 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.540672 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a29a5a0f-1bd0-4e47-845b-87f8f98a5c95-kserve-provision-location\") pod \"isvc-sklearn-batcher-custom-predictor-746b56d97c-np249\" (UID: \"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95\") " pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"
Apr 16 18:31:57.646646 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.646614 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"
Apr 16 18:31:57.767854 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:57.767823 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"]
Apr 16 18:31:57.770482 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:31:57.770449 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda29a5a0f_1bd0_4e47_845b_87f8f98a5c95.slice/crio-316dc6b757c584e593fe17c6f951220ee157ab642f9f055a226422845f85273e WatchSource:0}: Error finding container 316dc6b757c584e593fe17c6f951220ee157ab642f9f055a226422845f85273e: Status 404 returned error can't find the container with id 316dc6b757c584e593fe17c6f951220ee157ab642f9f055a226422845f85273e
Apr 16 18:31:58.037074 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:58.037035 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerStarted","Data":"2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e"}
Apr 16 18:31:58.037074 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:58.037078 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerStarted","Data":"316dc6b757c584e593fe17c6f951220ee157ab642f9f055a226422845f85273e"}
Apr 16 18:31:58.794436 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:58.794398 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:31:58.795556 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:31:58.795532 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 18:32:02.049915 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:02.049880 2566 generic.go:358] "Generic (PLEG): container finished" podID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerID="405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6" exitCode=0
Apr 16 18:32:02.050352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:02.049950 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerDied","Data":"405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6"}
Apr 16 18:32:02.051177 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:02.051155 2566 generic.go:358] "Generic (PLEG): container finished" podID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerID="2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e" exitCode=0
Apr 16 18:32:02.051286 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:02.051228 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerDied","Data":"2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e"}
Apr 16 18:32:03.056961 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:03.056918 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerStarted","Data":"d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee"}
Apr 16 18:32:03.057390 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:03.056974 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerStarted","Data":"16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e"}
Apr 16 18:32:03.057493 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:03.057470 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"
Apr 16 18:32:03.058847 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:03.058819 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused"
Apr 16 18:32:03.077692 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:03.077651 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podStartSLOduration=6.077638792 podStartE2EDuration="6.077638792s" podCreationTimestamp="2026-04-16 18:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:32:03.075668937 +0000 UTC m=+899.167956914" watchObservedRunningTime="2026-04-16 18:32:03.077638792 +0000 UTC m=+899.169926756"
Apr 16 18:32:04.060538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:04.060506 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"
Apr 16 18:32:04.061045 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:04.060636 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused"
Apr 16 18:32:04.061648 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:04.061622 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:32:05.063164 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:05.063125 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused"
Apr 16 18:32:05.063625 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:05.063358 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:32:08.795216 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:08.795167 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:32:08.795580 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:08.795315 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 18:32:15.063157 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:15.063109 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused"
Apr 16 18:32:15.063596 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:15.063568 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:32:18.794989 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:18.794942 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:32:18.795477 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:18.795109 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"
Apr 16 18:32:18.795477 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:18.795362 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.25:8080: connect: connection refused"
Apr 16 18:32:18.795576 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:18.795483 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"
Apr 16 18:32:25.064101 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:25.064052 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused"
Apr 16 18:32:25.064643 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:25.064619 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:32:27.401895 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:27.401874 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"
Apr 16 18:32:27.591815 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:27.591782 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e21c0c62-9da4-45fa-9593-be3ef2e83ce8-kserve-provision-location\") pod \"e21c0c62-9da4-45fa-9593-be3ef2e83ce8\" (UID: \"e21c0c62-9da4-45fa-9593-be3ef2e83ce8\") "
Apr 16 18:32:27.592146 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:27.592121 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e21c0c62-9da4-45fa-9593-be3ef2e83ce8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e21c0c62-9da4-45fa-9593-be3ef2e83ce8" (UID: "e21c0c62-9da4-45fa-9593-be3ef2e83ce8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:32:27.692763 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:27.692731 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e21c0c62-9da4-45fa-9593-be3ef2e83ce8-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:32:28.139524 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.139484 2566 generic.go:358] "Generic (PLEG): container finished" podID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerID="9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139" exitCode=0
Apr 16 18:32:28.139886 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.139564 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerDied","Data":"9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139"}
Apr 16 18:32:28.139886 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.139588 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"
Apr 16 18:32:28.139886 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.139607 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq" event={"ID":"e21c0c62-9da4-45fa-9593-be3ef2e83ce8","Type":"ContainerDied","Data":"17affb3314befa5d7fbb92b635be98d7d85efabcac711a0c26f560c98d13cc3e"}
Apr 16 18:32:28.139886 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.139624 2566 scope.go:117] "RemoveContainer" containerID="9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139"
Apr 16 18:32:28.148034 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.148018 2566 scope.go:117] "RemoveContainer" containerID="405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6"
Apr 16 18:32:28.155493 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.155477 2566 scope.go:117] "RemoveContainer" containerID="eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3"
Apr 16 18:32:28.165123 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.165092 2566 scope.go:117] "RemoveContainer" containerID="9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139"
Apr 16 18:32:28.165434 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:32:28.165412 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139\": container with ID starting with 9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139 not found: ID does not exist" containerID="9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139"
Apr 16 18:32:28.165512 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.165443 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139"} err="failed to get container status \"9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139\": rpc error: code = NotFound desc = could not find container \"9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139\": container with ID starting with 9040faa3d4c64b0d4b0bbdb62b6543476e4920018f3fc842369ae03917623139 not found: ID does not exist"
Apr 16 18:32:28.165512 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.165480 2566 scope.go:117] "RemoveContainer" containerID="405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6"
Apr 16 18:32:28.165765 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:32:28.165749 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6\": container with ID starting with 405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6 not found: ID does not exist" containerID="405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6"
Apr 16 18:32:28.165833 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.165769 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6"} err="failed to get container status \"405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6\": rpc error: code = NotFound desc = could not find container \"405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6\": container with ID starting with 405551bf05c02601407c14b43f86608df66a1ed9e342f6c41be9264dfbd302f6 not found: ID does not exist"
Apr 16 18:32:28.165833 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.165782 2566 scope.go:117] "RemoveContainer" containerID="eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3"
Apr 16 18:32:28.166049 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:32:28.166028 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3\": container with ID starting with eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3 not found: ID does not exist" containerID="eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3"
Apr 16 18:32:28.166110 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.166056 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3"} err="failed to get container status \"eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3\": rpc error: code = NotFound desc = could not find container \"eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3\": container with ID starting with eae6bccc8c6990a08b815d3aa7d772c882dd5bcc3bcbb0e6cf303dde5dedf8d3 not found: ID does not exist"
Apr 16 18:32:28.166581 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.166564 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"]
Apr 16 18:32:28.168971 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.168951 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-predictor-6f85f4fcc6-mtglq"]
Apr 16 18:32:28.452185 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:28.452111 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" path="/var/lib/kubelet/pods/e21c0c62-9da4-45fa-9593-be3ef2e83ce8/volumes"
Apr 16 18:32:35.063644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:35.063598 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused"
Apr 16 18:32:35.064069 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:35.063951 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:32:45.063978 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:45.063929 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused"
Apr 16 18:32:45.066364 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:45.064300 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
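The RemoveContainer / "ContainerStatus from runtime service failed" pairs above are a benign race: the containers are already gone, the CRI runtime answers with gRPC NotFound, and the kubelet just logs it and moves on. The usual Go-side pattern for recognizing that case looks like the sketch below (illustrative helper names, not kubelet code):

    package main

    import (
        "errors"
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // alreadyRemoved reports whether err is a gRPC NotFound, i.e. the
    // container we tried to inspect or delete no longer exists. Any other
    // error (including non-gRPC errors, which map to codes.Unknown) is real.
    func alreadyRemoved(err error) bool {
        return status.Code(err) == codes.NotFound
    }

    func main() {
        err := status.Error(codes.NotFound, "could not find container")
        fmt.Println(alreadyRemoved(err))                        // true
        fmt.Println(alreadyRemoved(errors.New("some failure"))) // false
    }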
statuscode: 503" Apr 16 18:32:55.063916 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:55.063870 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused" Apr 16 18:32:55.064369 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:32:55.064257 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:05.063474 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:05.063426 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused" Apr 16 18:33:05.063921 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:05.063898 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:15.063655 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:15.063623 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" Apr 16 18:33:15.064123 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:15.063961 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" Apr 16 18:33:22.476057 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:22.476026 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"] Apr 16 18:33:22.476443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:22.476309 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" containerID="cri-o://16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e" gracePeriod=30 Apr 16 18:33:22.476443 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:22.476337 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" containerID="cri-o://d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee" gracePeriod=30 Apr 16 18:33:25.063941 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:25.063849 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:25.064344 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:25.064236 2566 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused" Apr 16 18:33:27.314438 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:27.314404 2566 generic.go:358] "Generic (PLEG): container finished" podID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerID="16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e" exitCode=0 Apr 16 18:33:27.314811 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:27.314472 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerDied","Data":"16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e"} Apr 16 18:33:32.538961 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.538928 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"] Apr 16 18:33:32.539352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539274 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="storage-initializer" Apr 16 18:33:32.539352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539285 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="storage-initializer" Apr 16 18:33:32.539352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539302 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" Apr 16 18:33:32.539352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539307 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" Apr 16 18:33:32.539352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539317 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" Apr 16 18:33:32.539352 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539322 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" Apr 16 18:33:32.539559 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539376 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="kserve-container" Apr 16 18:33:32.539559 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.539387 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="e21c0c62-9da4-45fa-9593-be3ef2e83ce8" containerName="agent" Apr 16 18:33:32.542430 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.542414 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" Apr 16 18:33:32.550747 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.550723 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"] Apr 16 18:33:32.627909 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.627868 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4868eaf0-9b8d-491a-a32b-c3a7608a98fa-kserve-provision-location\") pod \"isvc-logger-predictor-67b779c5c9-pxbhm\" (UID: \"4868eaf0-9b8d-491a-a32b-c3a7608a98fa\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" Apr 16 18:33:32.728982 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.728945 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4868eaf0-9b8d-491a-a32b-c3a7608a98fa-kserve-provision-location\") pod \"isvc-logger-predictor-67b779c5c9-pxbhm\" (UID: \"4868eaf0-9b8d-491a-a32b-c3a7608a98fa\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" Apr 16 18:33:32.729373 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.729351 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4868eaf0-9b8d-491a-a32b-c3a7608a98fa-kserve-provision-location\") pod \"isvc-logger-predictor-67b779c5c9-pxbhm\" (UID: \"4868eaf0-9b8d-491a-a32b-c3a7608a98fa\") " pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" Apr 16 18:33:32.853177 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.853140 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" Apr 16 18:33:32.972988 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.972954 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"] Apr 16 18:33:32.976398 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:33:32.976361 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4868eaf0_9b8d_491a_a32b_c3a7608a98fa.slice/crio-7a21ddf5b2ff0c3b9d4b4e6babb396a65e8b34f52319fbb21966948163eecc19 WatchSource:0}: Error finding container 7a21ddf5b2ff0c3b9d4b4e6babb396a65e8b34f52319fbb21966948163eecc19: Status 404 returned error can't find the container with id 7a21ddf5b2ff0c3b9d4b4e6babb396a65e8b34f52319fbb21966948163eecc19 Apr 16 18:33:32.978186 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:32.978169 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:33:33.333714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:33.333626 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerStarted","Data":"d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951"} Apr 16 18:33:33.333714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:33.333662 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerStarted","Data":"7a21ddf5b2ff0c3b9d4b4e6babb396a65e8b34f52319fbb21966948163eecc19"} Apr 16 18:33:35.063716 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:35.063664 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:35.064211 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:35.064160 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused" Apr 16 18:33:37.348150 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:37.348112 2566 generic.go:358] "Generic (PLEG): container finished" podID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerID="d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951" exitCode=0 Apr 16 18:33:37.348541 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:37.348187 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerDied","Data":"d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951"} Apr 16 18:33:38.353457 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:38.353424 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerStarted","Data":"f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af"} Apr 16 18:33:38.353457 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:38.353464 2566 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerStarted","Data":"69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528"} Apr 16 18:33:38.353927 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:38.353762 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" Apr 16 18:33:38.353927 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:38.353794 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" Apr 16 18:33:38.355212 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:38.355173 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 18:33:38.355806 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:38.355786 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:38.375093 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:38.375053 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podStartSLOduration=6.375040923 podStartE2EDuration="6.375040923s" podCreationTimestamp="2026-04-16 18:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:33:38.37314459 +0000 UTC m=+994.465432554" watchObservedRunningTime="2026-04-16 18:33:38.375040923 +0000 UTC m=+994.467328885" Apr 16 18:33:39.356472 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:39.356424 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 18:33:39.356914 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:39.356832 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:45.064219 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:45.064170 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.26:5000: connect: connection refused" Apr 16 18:33:45.064629 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:45.064190 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:45.064629 ip-10-0-136-226 kubenswrapper[2566]: I0416 
18:33:45.064306 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" Apr 16 18:33:45.064629 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:45.064368 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" Apr 16 18:33:49.356620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:49.356566 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused" Apr 16 18:33:49.357175 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:49.357144 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 18:33:53.117116 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.117088 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" Apr 16 18:33:53.200188 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.200153 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a29a5a0f-1bd0-4e47-845b-87f8f98a5c95-kserve-provision-location\") pod \"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95\" (UID: \"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95\") " Apr 16 18:33:53.200472 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.200447 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a29a5a0f-1bd0-4e47-845b-87f8f98a5c95-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" (UID: "a29a5a0f-1bd0-4e47-845b-87f8f98a5c95"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:33:53.301527 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.301451 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a29a5a0f-1bd0-4e47-845b-87f8f98a5c95-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:33:53.397468 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.397438 2566 generic.go:358] "Generic (PLEG): container finished" podID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerID="d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee" exitCode=0 Apr 16 18:33:53.397614 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.397483 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerDied","Data":"d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee"} Apr 16 18:33:53.397614 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.397507 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" event={"ID":"a29a5a0f-1bd0-4e47-845b-87f8f98a5c95","Type":"ContainerDied","Data":"316dc6b757c584e593fe17c6f951220ee157ab642f9f055a226422845f85273e"} Apr 16 18:33:53.397614 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.397511 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249" Apr 16 18:33:53.397614 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.397525 2566 scope.go:117] "RemoveContainer" containerID="d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee" Apr 16 18:33:53.407224 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.407136 2566 scope.go:117] "RemoveContainer" containerID="16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e" Apr 16 18:33:53.414538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.414523 2566 scope.go:117] "RemoveContainer" containerID="2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e" Apr 16 18:33:53.421062 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.421043 2566 scope.go:117] "RemoveContainer" containerID="d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee" Apr 16 18:33:53.421294 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:33:53.421278 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee\": container with ID starting with d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee not found: ID does not exist" containerID="d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee" Apr 16 18:33:53.421344 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.421303 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee"} err="failed to get container status \"d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee\": rpc error: code = NotFound desc = could not find container \"d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee\": container with ID starting with d18fcd61aba3df8eb651ca8085791236a0a3370c909b40dff424c2dfbbbb75ee not found: ID does not exist" Apr 16 18:33:53.421344 
Apr 16 18:33:53.421344 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.421321 2566 scope.go:117] "RemoveContainer" containerID="16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e"
Apr 16 18:33:53.421561 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:33:53.421541 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e\": container with ID starting with 16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e not found: ID does not exist" containerID="16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e"
Apr 16 18:33:53.421624 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.421571 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e"} err="failed to get container status \"16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e\": rpc error: code = NotFound desc = could not find container \"16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e\": container with ID starting with 16dcd278f3a8a24a7b18f343a3729999bb8f94d4f1609e2f328288f53171644e not found: ID does not exist"
Apr 16 18:33:53.421624 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.421595 2566 scope.go:117] "RemoveContainer" containerID="2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e"
Apr 16 18:33:53.421844 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:33:53.421827 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e\": container with ID starting with 2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e not found: ID does not exist" containerID="2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e"
Apr 16 18:33:53.421885 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.421848 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e"} err="failed to get container status \"2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e\": rpc error: code = NotFound desc = could not find container \"2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e\": container with ID starting with 2d4bb03fc77c16068610af5a6471be003114ac65972623127d6e0613af9e2e8e not found: ID does not exist"
Apr 16 18:33:53.425010 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.424972 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"]
Apr 16 18:33:53.429965 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:53.429944 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-batcher-custom-predictor-746b56d97c-np249"]
Apr 16 18:33:54.452148 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:54.452112 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" path="/var/lib/kubelet/pods/a29a5a0f-1bd0-4e47-845b-87f8f98a5c95/volumes"
Apr 16 18:33:59.356577 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:59.356535 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:33:59.357136 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:33:59.357111 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:34:09.356877 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:09.356831 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:34:09.357411 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:09.357381 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:34:19.356555 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:19.356508 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:34:19.357166 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:19.357139 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:34:29.356639 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:29.356586 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:34:29.357192 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:29.357048 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:34:39.356622 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:39.356560 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:34:39.357119 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:39.357034 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:34:49.357748 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:49.357712 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"
Apr 16 18:34:49.358162 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:49.357826 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"
Apr 16 18:34:57.801426 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.801346 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"]
Apr 16 18:34:57.801779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.801681 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" containerID="cri-o://69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528" gracePeriod=30
Apr 16 18:34:57.801779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.801700 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" containerID="cri-o://f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af" gracePeriod=30
Apr 16 18:34:57.835958 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.835930 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"]
Apr 16 18:34:57.836282 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836270 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent"
Apr 16 18:34:57.836334 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836283 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent"
Apr 16 18:34:57.836334 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836293 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container"
Apr 16 18:34:57.836334 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836299 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container"
Apr 16 18:34:57.836334 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836314 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="storage-initializer"
Apr 16 18:34:57.836334 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836320 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="storage-initializer"
Apr 16 18:34:57.836481 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836370 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="kserve-container"
Apr 16 18:34:57.836481 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.836381 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a29a5a0f-1bd0-4e47-845b-87f8f98a5c95" containerName="agent"
Apr 16 18:34:57.839569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.839553 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"
Apr 16 18:34:57.853850 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.853825 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"]
Apr 16 18:34:57.935226 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:57.935193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c1255b-aa08-432e-805b-bfedba0c115e-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-h655z\" (UID: \"a0c1255b-aa08-432e-805b-bfedba0c115e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"
Apr 16 18:34:58.036501 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:58.036472 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c1255b-aa08-432e-805b-bfedba0c115e-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-h655z\" (UID: \"a0c1255b-aa08-432e-805b-bfedba0c115e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"
Apr 16 18:34:58.036871 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:58.036850 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c1255b-aa08-432e-805b-bfedba0c115e-kserve-provision-location\") pod \"isvc-lightgbm-predictor-78c8d484d6-h655z\" (UID: \"a0c1255b-aa08-432e-805b-bfedba0c115e\") " pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"
Apr 16 18:34:58.149558 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:58.149521 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"
Apr 16 18:34:58.269947 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:58.269831 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"]
Apr 16 18:34:58.272548 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:34:58.272516 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c1255b_aa08_432e_805b_bfedba0c115e.slice/crio-d35559eb5f9028716a6fc9d86ae149c261621c36a09bdc01bf831a1ddc73676d WatchSource:0}: Error finding container d35559eb5f9028716a6fc9d86ae149c261621c36a09bdc01bf831a1ddc73676d: Status 404 returned error can't find the container with id d35559eb5f9028716a6fc9d86ae149c261621c36a09bdc01bf831a1ddc73676d
Apr 16 18:34:58.592929 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:58.592900 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" event={"ID":"a0c1255b-aa08-432e-805b-bfedba0c115e","Type":"ContainerStarted","Data":"9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c"}
Apr 16 18:34:58.592929 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:58.592936 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" event={"ID":"a0c1255b-aa08-432e-805b-bfedba0c115e","Type":"ContainerStarted","Data":"d35559eb5f9028716a6fc9d86ae149c261621c36a09bdc01bf831a1ddc73676d"}
Apr 16 18:34:59.357262 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:59.357217 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:34:59.357701 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:34:59.357546 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:35:02.609691 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:02.609653 2566 generic.go:358] "Generic (PLEG): container finished" podID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerID="9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c" exitCode=0
Apr 16 18:35:02.610167 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:02.609731 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" event={"ID":"a0c1255b-aa08-432e-805b-bfedba0c115e","Type":"ContainerDied","Data":"9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c"}
Apr 16 18:35:02.611785 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:02.611762 2566 generic.go:358] "Generic (PLEG): container finished" podID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerID="69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528" exitCode=0
Apr 16 18:35:02.611883 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:02.611817 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerDied","Data":"69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528"}
Apr 16 18:35:09.356355 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:09.356312 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:35:09.356777 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:09.356628 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:35:09.638691 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:09.638615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" event={"ID":"a0c1255b-aa08-432e-805b-bfedba0c115e","Type":"ContainerStarted","Data":"8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168"}
Apr 16 18:35:09.638921 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:09.638894 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"
Apr 16 18:35:09.639917 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:09.639891 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 18:35:09.657624 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:09.657581 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podStartSLOduration=6.427397728 podStartE2EDuration="12.657568304s" podCreationTimestamp="2026-04-16 18:34:57 +0000 UTC" firstStartedPulling="2026-04-16 18:35:02.611219754 +0000 UTC m=+1078.703507701" lastFinishedPulling="2026-04-16 18:35:08.841390326 +0000 UTC m=+1084.933678277" observedRunningTime="2026-04-16 18:35:09.655868915 +0000 UTC m=+1085.748156880" watchObservedRunningTime="2026-04-16 18:35:09.657568304 +0000 UTC m=+1085.749856268"
Apr 16 18:35:10.642436 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:10.642390 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 18:35:19.357320 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:19.357274 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.27:8080: connect: connection refused"
Apr 16 18:35:19.357772 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:19.357440 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"
Apr 16 18:35:19.358062 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:19.357612 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 16 18:35:19.358282 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:19.358260 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"
Apr 16 18:35:20.643326 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:20.643283 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused"
Apr 16 18:35:27.965320 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:27.965298 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"
Apr 16 18:35:27.993735 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:27.993710 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4868eaf0-9b8d-491a-a32b-c3a7608a98fa-kserve-provision-location\") pod \"4868eaf0-9b8d-491a-a32b-c3a7608a98fa\" (UID: \"4868eaf0-9b8d-491a-a32b-c3a7608a98fa\") "
Apr 16 18:35:27.993979 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:27.993950 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4868eaf0-9b8d-491a-a32b-c3a7608a98fa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4868eaf0-9b8d-491a-a32b-c3a7608a98fa" (UID: "4868eaf0-9b8d-491a-a32b-c3a7608a98fa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:35:28.094814 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.094788 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4868eaf0-9b8d-491a-a32b-c3a7608a98fa-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:35:28.695460 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.695424 2566 generic.go:358] "Generic (PLEG): container finished" podID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerID="f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af" exitCode=137
Apr 16 18:35:28.695610 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.695489 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerDied","Data":"f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af"}
Apr 16 18:35:28.695610 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.695499 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"
Apr 16 18:35:28.695610 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.695516 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm" event={"ID":"4868eaf0-9b8d-491a-a32b-c3a7608a98fa","Type":"ContainerDied","Data":"7a21ddf5b2ff0c3b9d4b4e6babb396a65e8b34f52319fbb21966948163eecc19"}
Apr 16 18:35:28.695610 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.695533 2566 scope.go:117] "RemoveContainer" containerID="f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af"
Apr 16 18:35:28.702950 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.702931 2566 scope.go:117] "RemoveContainer" containerID="69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528"
Apr 16 18:35:28.709877 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.709842 2566 scope.go:117] "RemoveContainer" containerID="d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951"
Apr 16 18:35:28.716091 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.716068 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"]
Apr 16 18:35:28.716866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.716846 2566 scope.go:117] "RemoveContainer" containerID="f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af"
Apr 16 18:35:28.717158 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:35:28.717136 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af\": container with ID starting with f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af not found: ID does not exist" containerID="f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af"
Apr 16 18:35:28.717229 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.717170 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af"} err="failed to get container status \"f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af\": rpc error: code = NotFound desc = could not find container \"f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af\": container with ID starting with f0820ab4c65298f96291685013b27b0dd5edd2824526394aa0d96de25db393af not found: ID does not exist"
Apr 16 18:35:28.717229 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.717195 2566 scope.go:117] "RemoveContainer" containerID="69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528"
Apr 16 18:35:28.717474 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:35:28.717449 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528\": container with ID starting with 69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528 not found: ID does not exist" containerID="69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528"
\"69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528\": rpc error: code = NotFound desc = could not find container \"69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528\": container with ID starting with 69054e5bf9749033e3b43fefb529f7e1c2edfce0f9598764dbe5bbb14ef9f528 not found: ID does not exist" Apr 16 18:35:28.717521 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.717491 2566 scope.go:117] "RemoveContainer" containerID="d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951" Apr 16 18:35:28.717719 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:35:28.717697 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951\": container with ID starting with d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951 not found: ID does not exist" containerID="d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951" Apr 16 18:35:28.717763 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.717717 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951"} err="failed to get container status \"d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951\": rpc error: code = NotFound desc = could not find container \"d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951\": container with ID starting with d8e75c4ff424d51add3572383e57e1347092cd43d882889ebb155401db43e951 not found: ID does not exist" Apr 16 18:35:28.721338 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:28.721317 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-predictor-67b779c5c9-pxbhm"] Apr 16 18:35:30.451979 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:30.451945 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" path="/var/lib/kubelet/pods/4868eaf0-9b8d-491a-a32b-c3a7608a98fa/volumes" Apr 16 18:35:30.643323 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:30.643284 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 18:35:40.642596 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:40.642553 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 18:35:50.642650 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:35:50.642610 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 18:36:00.643140 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:00.643090 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: 
connection refused" Apr 16 18:36:10.643043 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:10.642975 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 18:36:20.642594 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:20.642550 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 18:36:30.643234 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:30.643155 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" Apr 16 18:36:38.019401 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.019365 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"] Apr 16 18:36:38.019794 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.019668 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" containerID="cri-o://8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168" gracePeriod=30 Apr 16 18:36:38.097755 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.097712 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"] Apr 16 18:36:38.098077 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098064 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="storage-initializer" Apr 16 18:36:38.098127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098079 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="storage-initializer" Apr 16 18:36:38.098127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098093 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" Apr 16 18:36:38.098127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098098 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="kserve-container" Apr 16 18:36:38.098127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098104 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" Apr 16 18:36:38.098127 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098110 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" Apr 16 18:36:38.098278 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098165 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" containerName="agent" Apr 16 18:36:38.098278 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.098174 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="4868eaf0-9b8d-491a-a32b-c3a7608a98fa" 
containerName="kserve-container" Apr 16 18:36:38.100736 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.100715 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" Apr 16 18:36:38.109197 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.109174 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"] Apr 16 18:36:38.180584 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.180526 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc\" (UID: \"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" Apr 16 18:36:38.281184 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.281105 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc\" (UID: \"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" Apr 16 18:36:38.281436 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.281416 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef-kserve-provision-location\") pod \"isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc\" (UID: \"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef\") " pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" Apr 16 18:36:38.411774 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.411744 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" Apr 16 18:36:38.531533 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.531508 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"] Apr 16 18:36:38.534397 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:36:38.534365 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa83d3c0_9447_4bd7_8f2b_545f2e3ecdef.slice/crio-13bc9f4e4d1b4050ed8636b04b4c929bab8c0425528096a35885c94de7265c64 WatchSource:0}: Error finding container 13bc9f4e4d1b4050ed8636b04b4c929bab8c0425528096a35885c94de7265c64: Status 404 returned error can't find the container with id 13bc9f4e4d1b4050ed8636b04b4c929bab8c0425528096a35885c94de7265c64 Apr 16 18:36:38.908070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.908033 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" event={"ID":"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef","Type":"ContainerStarted","Data":"360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63"} Apr 16 18:36:38.908070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:38.908069 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" event={"ID":"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef","Type":"ContainerStarted","Data":"13bc9f4e4d1b4050ed8636b04b4c929bab8c0425528096a35885c94de7265c64"} Apr 16 18:36:40.643222 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:40.643185 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.28:8080: connect: connection refused" Apr 16 18:36:42.467639 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.467618 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" Apr 16 18:36:42.518668 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.518596 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c1255b-aa08-432e-805b-bfedba0c115e-kserve-provision-location\") pod \"a0c1255b-aa08-432e-805b-bfedba0c115e\" (UID: \"a0c1255b-aa08-432e-805b-bfedba0c115e\") " Apr 16 18:36:42.518981 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.518950 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0c1255b-aa08-432e-805b-bfedba0c115e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a0c1255b-aa08-432e-805b-bfedba0c115e" (UID: "a0c1255b-aa08-432e-805b-bfedba0c115e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:36:42.619611 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.619580 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a0c1255b-aa08-432e-805b-bfedba0c115e-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:36:42.920751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.920712 2566 generic.go:358] "Generic (PLEG): container finished" podID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerID="8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168" exitCode=0 Apr 16 18:36:42.921014 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.920786 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" Apr 16 18:36:42.921014 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.920795 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" event={"ID":"a0c1255b-aa08-432e-805b-bfedba0c115e","Type":"ContainerDied","Data":"8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168"} Apr 16 18:36:42.921014 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.920839 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z" event={"ID":"a0c1255b-aa08-432e-805b-bfedba0c115e","Type":"ContainerDied","Data":"d35559eb5f9028716a6fc9d86ae149c261621c36a09bdc01bf831a1ddc73676d"} Apr 16 18:36:42.921014 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.920861 2566 scope.go:117] "RemoveContainer" containerID="8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168" Apr 16 18:36:42.922214 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.922191 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerID="360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63" exitCode=0 Apr 16 18:36:42.922308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.922224 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" event={"ID":"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef","Type":"ContainerDied","Data":"360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63"} Apr 16 18:36:42.928901 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.928885 2566 scope.go:117] "RemoveContainer" containerID="9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c" Apr 16 18:36:42.935634 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.935614 2566 scope.go:117] "RemoveContainer" containerID="8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168" Apr 16 18:36:42.935856 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:36:42.935836 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168\": container with ID starting with 8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168 not found: ID does not exist" containerID="8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168" Apr 16 18:36:42.935906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.935862 2566 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168"} err="failed to get container status \"8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168\": rpc error: code = NotFound desc = could not find container \"8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168\": container with ID starting with 8181eea06b5cd8c77f10d2a6e03794314151ab33e969f13824d2101161e6f168 not found: ID does not exist" Apr 16 18:36:42.935906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.935878 2566 scope.go:117] "RemoveContainer" containerID="9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c" Apr 16 18:36:42.936134 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:36:42.936118 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c\": container with ID starting with 9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c not found: ID does not exist" containerID="9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c" Apr 16 18:36:42.936179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.936140 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c"} err="failed to get container status \"9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c\": rpc error: code = NotFound desc = could not find container \"9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c\": container with ID starting with 9d5def388d0c57a5164b770f6272edcdd48970b941133624657d1015619fc31c not found: ID does not exist" Apr 16 18:36:42.959831 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.959810 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"] Apr 16 18:36:42.964080 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:42.964062 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-predictor-78c8d484d6-h655z"] Apr 16 18:36:43.927785 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:43.927753 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" event={"ID":"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef","Type":"ContainerStarted","Data":"06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33"} Apr 16 18:36:43.928236 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:43.928035 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" Apr 16 18:36:43.929433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:43.929407 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused" Apr 16 18:36:43.947648 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:43.947610 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podStartSLOduration=5.94759867 podStartE2EDuration="5.94759867s" podCreationTimestamp="2026-04-16 18:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
Apr 16 18:36:43.947648 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:43.947610 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podStartSLOduration=5.94759867 podStartE2EDuration="5.94759867s" podCreationTimestamp="2026-04-16 18:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:36:43.945586066 +0000 UTC m=+1180.037874043" watchObservedRunningTime="2026-04-16 18:36:43.94759867 +0000 UTC m=+1180.039886635"
Apr 16 18:36:44.452569 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:44.452532 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" path="/var/lib/kubelet/pods/a0c1255b-aa08-432e-805b-bfedba0c115e/volumes"
Apr 16 18:36:44.931389 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:44.931353 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:36:54.931599 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:36:54.931560 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:37:04.932308 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:37:04.932260 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:37:14.932400 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:37:14.932358 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:37:24.931698 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:37:24.931650 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:37:34.931490 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:37:34.931445 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:37:44.932192 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:37:44.932147 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:37:54.931589 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:37:54.931497 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.29:8080: connect: connection refused"
Apr 16 18:38:04.932189 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:04.932155 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"
Apr 16 18:38:08.545499 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.545463 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"]
Apr 16 18:38:08.545912 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.545840 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" containerID="cri-o://06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33" gracePeriod=30
Apr 16 18:38:08.625495 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.625465 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"]
Apr 16 18:38:08.625804 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.625791 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="storage-initializer"
Apr 16 18:38:08.625853 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.625806 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="storage-initializer"
Apr 16 18:38:08.625853 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.625828 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container"
Apr 16 18:38:08.625853 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.625835 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container"
Apr 16 18:38:08.625951 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.625882 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a0c1255b-aa08-432e-805b-bfedba0c115e" containerName="kserve-container"
Apr 16 18:38:08.628538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.628520 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"
Apr 16 18:38:08.637591 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.637567 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"]
Apr 16 18:38:08.723324 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.723295 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bf4f9c1-6464-40de-afb6-cb1323df86f1-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg\" (UID: \"6bf4f9c1-6464-40de-afb6-cb1323df86f1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"
Apr 16 18:38:08.823677 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.823589 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bf4f9c1-6464-40de-afb6-cb1323df86f1-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg\" (UID: \"6bf4f9c1-6464-40de-afb6-cb1323df86f1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"
Apr 16 18:38:08.823957 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.823931 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bf4f9c1-6464-40de-afb6-cb1323df86f1-kserve-provision-location\") pod \"isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg\" (UID: \"6bf4f9c1-6464-40de-afb6-cb1323df86f1\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"
Apr 16 18:38:08.938606 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:08.938576 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"
Apr 16 18:38:09.066223 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:09.066194 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"]
Apr 16 18:38:09.068749 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:38:09.068725 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bf4f9c1_6464_40de_afb6_cb1323df86f1.slice/crio-873a6d5a668397d8da325f039515a9c580334bf7c2338a6b9d9ad6391e85391a WatchSource:0}: Error finding container 873a6d5a668397d8da325f039515a9c580334bf7c2338a6b9d9ad6391e85391a: Status 404 returned error can't find the container with id 873a6d5a668397d8da325f039515a9c580334bf7c2338a6b9d9ad6391e85391a
Apr 16 18:38:09.180361 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:09.180324 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" event={"ID":"6bf4f9c1-6464-40de-afb6-cb1323df86f1","Type":"ContainerStarted","Data":"a6af1e253caf2e908348aea5c0e6704e9093358cf5e6d436ba9895f6d20915c4"}
Apr 16 18:38:09.180361 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:09.180365 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" event={"ID":"6bf4f9c1-6464-40de-afb6-cb1323df86f1","Type":"ContainerStarted","Data":"873a6d5a668397d8da325f039515a9c580334bf7c2338a6b9d9ad6391e85391a"}
Apr 16 18:38:13.085201 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.085177 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"
Apr 16 18:38:13.156738 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.156674 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef-kserve-provision-location\") pod \"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef\" (UID: \"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef\") "
Apr 16 18:38:13.156984 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.156964 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" (UID: "aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:38:13.192652 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.192623 2566 generic.go:358] "Generic (PLEG): container finished" podID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerID="06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33" exitCode=0
Apr 16 18:38:13.192773 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.192686 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"
Apr 16 18:38:13.192773 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.192692 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" event={"ID":"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef","Type":"ContainerDied","Data":"06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33"}
Apr 16 18:38:13.192773 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.192721 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc" event={"ID":"aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef","Type":"ContainerDied","Data":"13bc9f4e4d1b4050ed8636b04b4c929bab8c0425528096a35885c94de7265c64"}
Apr 16 18:38:13.192773 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.192742 2566 scope.go:117] "RemoveContainer" containerID="06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33"
Apr 16 18:38:13.194154 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.194129 2566 generic.go:358] "Generic (PLEG): container finished" podID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerID="a6af1e253caf2e908348aea5c0e6704e9093358cf5e6d436ba9895f6d20915c4" exitCode=0
Apr 16 18:38:13.194251 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.194172 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" event={"ID":"6bf4f9c1-6464-40de-afb6-cb1323df86f1","Type":"ContainerDied","Data":"a6af1e253caf2e908348aea5c0e6704e9093358cf5e6d436ba9895f6d20915c4"}
Apr 16 18:38:13.201006 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.200975 2566 scope.go:117] "RemoveContainer" containerID="360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63"
Apr 16 18:38:13.207521 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.207507 2566 scope.go:117] "RemoveContainer" containerID="06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33"
Apr 16 18:38:13.207753 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:38:13.207736 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33\": container with ID starting with 06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33 not found: ID does not exist" containerID="06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33"
Apr 16 18:38:13.207796 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.207764 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33"} err="failed to get container status \"06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33\": rpc error: code = NotFound desc = could not find container \"06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33\": container with ID starting with 06f546a56b262ceb05319f97b33720ed137d5f88a9ed22dac29517dc1ad5ee33 not found: ID does not exist"
Apr 16 18:38:13.207796 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.207782 2566 scope.go:117] "RemoveContainer" containerID="360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63"
Apr 16 18:38:13.208002 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:38:13.207971 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63\": container with ID starting with 360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63 not found: ID does not exist" containerID="360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63"
Apr 16 18:38:13.208050 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.208017 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63"} err="failed to get container status \"360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63\": rpc error: code = NotFound desc = could not find container \"360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63\": container with ID starting with 360a66f6e6e5f040a166c3d88caf20e1cf85b13d5baf6caae64032b4471f6a63 not found: ID does not exist"
Apr 16 18:38:13.228279 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.228255 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"]
Apr 16 18:38:13.236009 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.235078 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-runtime-predictor-64984c7cb-tn9lc"]
Apr 16 18:38:13.258056 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:13.258025 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:38:14.454793 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:38:14.454757 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" path="/var/lib/kubelet/pods/aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef/volumes"
Apr 16 18:40:34.663931 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:40:34.663892 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" event={"ID":"6bf4f9c1-6464-40de-afb6-cb1323df86f1","Type":"ContainerStarted","Data":"979460c3b4f99e8cf5aeb749943575ace947af161856ab8c6b67db4417c00b65"}
Apr 16 18:40:34.664415 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:40:34.664113 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"
Apr 16 18:40:34.693231 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:40:34.693181 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" podStartSLOduration=6.227048838 podStartE2EDuration="2m26.693168979s" podCreationTimestamp="2026-04-16 18:38:08 +0000 UTC" firstStartedPulling="2026-04-16 18:38:13.195319504 +0000 UTC m=+1269.287607448" lastFinishedPulling="2026-04-16 18:40:33.661439643 +0000 UTC m=+1409.753727589" observedRunningTime="2026-04-16 18:40:34.692313424 +0000 UTC m=+1410.784601389" watchObservedRunningTime="2026-04-16 18:40:34.693168979 +0000 UTC m=+1410.785456945"
Apr 16 18:41:05.672780 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:05.672695 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"
source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"] Apr 16 18:41:08.817325 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.817229 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" podUID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerName="kserve-container" containerID="cri-o://979460c3b4f99e8cf5aeb749943575ace947af161856ab8c6b67db4417c00b65" gracePeriod=30 Apr 16 18:41:08.902328 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.902294 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b"] Apr 16 18:41:08.902701 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.902688 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" Apr 16 18:41:08.902744 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.902704 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" Apr 16 18:41:08.902744 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.902713 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="storage-initializer" Apr 16 18:41:08.902744 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.902720 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="storage-initializer" Apr 16 18:41:08.902845 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.902789 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="aa83d3c0-9447-4bd7-8f2b-545f2e3ecdef" containerName="kserve-container" Apr 16 18:41:08.923898 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.923867 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b"] Apr 16 18:41:08.924063 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:08.924006 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:09.035229 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.035193 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/196370fa-fe37-448a-b863-3ae1ecd9d670-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b\" (UID: \"196370fa-fe37-448a-b863-3ae1ecd9d670\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:09.136181 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.136148 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/196370fa-fe37-448a-b863-3ae1ecd9d670-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b\" (UID: \"196370fa-fe37-448a-b863-3ae1ecd9d670\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:09.136507 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.136487 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/196370fa-fe37-448a-b863-3ae1ecd9d670-kserve-provision-location\") pod \"isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b\" (UID: \"196370fa-fe37-448a-b863-3ae1ecd9d670\") " pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:09.235719 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.235690 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:09.357799 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.357750 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b"] Apr 16 18:41:09.361110 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:41:09.361073 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod196370fa_fe37_448a_b863_3ae1ecd9d670.slice/crio-d5606ee85fcbba496a04dea412238f6535f8e9c1026808b7975bf4227d04be35 WatchSource:0}: Error finding container d5606ee85fcbba496a04dea412238f6535f8e9c1026808b7975bf4227d04be35: Status 404 returned error can't find the container with id d5606ee85fcbba496a04dea412238f6535f8e9c1026808b7975bf4227d04be35 Apr 16 18:41:09.366144 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.366123 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:41:09.771796 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.771719 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" event={"ID":"196370fa-fe37-448a-b863-3ae1ecd9d670","Type":"ContainerStarted","Data":"fda84c930de16f22746e0c78244ccfacbe7717e1e2802ac7f7aabaff9e6f2537"} Apr 16 18:41:09.771796 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.771772 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" event={"ID":"196370fa-fe37-448a-b863-3ae1ecd9d670","Type":"ContainerStarted","Data":"d5606ee85fcbba496a04dea412238f6535f8e9c1026808b7975bf4227d04be35"} Apr 16 18:41:09.773813 ip-10-0-136-226 kubenswrapper[2566]: I0416 
18:41:09.773787 2566 generic.go:358] "Generic (PLEG): container finished" podID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerID="979460c3b4f99e8cf5aeb749943575ace947af161856ab8c6b67db4417c00b65" exitCode=0 Apr 16 18:41:09.773930 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.773820 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" event={"ID":"6bf4f9c1-6464-40de-afb6-cb1323df86f1","Type":"ContainerDied","Data":"979460c3b4f99e8cf5aeb749943575ace947af161856ab8c6b67db4417c00b65"} Apr 16 18:41:09.869311 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:09.869288 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" Apr 16 18:41:10.045686 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.045602 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bf4f9c1-6464-40de-afb6-cb1323df86f1-kserve-provision-location\") pod \"6bf4f9c1-6464-40de-afb6-cb1323df86f1\" (UID: \"6bf4f9c1-6464-40de-afb6-cb1323df86f1\") " Apr 16 18:41:10.045913 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.045890 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bf4f9c1-6464-40de-afb6-cb1323df86f1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6bf4f9c1-6464-40de-afb6-cb1323df86f1" (UID: "6bf4f9c1-6464-40de-afb6-cb1323df86f1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:10.147055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.147018 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6bf4f9c1-6464-40de-afb6-cb1323df86f1-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:41:10.779409 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.779337 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" Apr 16 18:41:10.779535 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.779336 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg" event={"ID":"6bf4f9c1-6464-40de-afb6-cb1323df86f1","Type":"ContainerDied","Data":"873a6d5a668397d8da325f039515a9c580334bf7c2338a6b9d9ad6391e85391a"} Apr 16 18:41:10.779535 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.779455 2566 scope.go:117] "RemoveContainer" containerID="979460c3b4f99e8cf5aeb749943575ace947af161856ab8c6b67db4417c00b65" Apr 16 18:41:10.787338 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.787318 2566 scope.go:117] "RemoveContainer" containerID="a6af1e253caf2e908348aea5c0e6704e9093358cf5e6d436ba9895f6d20915c4" Apr 16 18:41:10.802682 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.802661 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"] Apr 16 18:41:10.804214 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:10.804194 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-runtime-predictor-8d44c64dc-dhkwg"] Apr 16 18:41:12.451891 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:12.451860 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" path="/var/lib/kubelet/pods/6bf4f9c1-6464-40de-afb6-cb1323df86f1/volumes" Apr 16 18:41:13.791114 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:13.791082 2566 generic.go:358] "Generic (PLEG): container finished" podID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerID="fda84c930de16f22746e0c78244ccfacbe7717e1e2802ac7f7aabaff9e6f2537" exitCode=0 Apr 16 18:41:13.791483 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:13.791156 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" event={"ID":"196370fa-fe37-448a-b863-3ae1ecd9d670","Type":"ContainerDied","Data":"fda84c930de16f22746e0c78244ccfacbe7717e1e2802ac7f7aabaff9e6f2537"} Apr 16 18:41:14.795776 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:14.795745 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" event={"ID":"196370fa-fe37-448a-b863-3ae1ecd9d670","Type":"ContainerStarted","Data":"7ac21c1d78d1d7eeda7426e6bd07b663eb63f9f88f9be64dbdd8d9ac423a752f"} Apr 16 18:41:14.796180 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:14.796041 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:14.797414 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:14.797382 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 18:41:14.817195 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:14.817147 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" podStartSLOduration=6.817134555 podStartE2EDuration="6.817134555s" podCreationTimestamp="2026-04-16 18:41:08 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:14.815404139 +0000 UTC m=+1450.907692106" watchObservedRunningTime="2026-04-16 18:41:14.817134555 +0000 UTC m=+1450.909422519" Apr 16 18:41:15.799167 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:15.799132 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.31:8080: connect: connection refused" Apr 16 18:41:25.801045 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:25.801013 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:29.030314 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.030277 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b"] Apr 16 18:41:29.030751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.030539 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="kserve-container" containerID="cri-o://7ac21c1d78d1d7eeda7426e6bd07b663eb63f9f88f9be64dbdd8d9ac423a752f" gracePeriod=30 Apr 16 18:41:29.145720 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.145679 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn"] Apr 16 18:41:29.146128 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.146111 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerName="storage-initializer" Apr 16 18:41:29.146230 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.146131 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerName="storage-initializer" Apr 16 18:41:29.146230 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.146151 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerName="kserve-container" Apr 16 18:41:29.146230 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.146160 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerName="kserve-container" Apr 16 18:41:29.146387 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.146248 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bf4f9c1-6464-40de-afb6-cb1323df86f1" containerName="kserve-container" Apr 16 18:41:29.149761 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.149739 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:41:29.163122 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.163097 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn"] Apr 16 18:41:29.208096 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.208064 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e322a2-3594-4ce4-8cfc-8e23125160b6-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn\" (UID: \"f8e322a2-3594-4ce4-8cfc-8e23125160b6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:41:29.308934 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.308822 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e322a2-3594-4ce4-8cfc-8e23125160b6-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn\" (UID: \"f8e322a2-3594-4ce4-8cfc-8e23125160b6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:41:29.309249 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.309228 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e322a2-3594-4ce4-8cfc-8e23125160b6-kserve-provision-location\") pod \"isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn\" (UID: \"f8e322a2-3594-4ce4-8cfc-8e23125160b6\") " pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:41:29.459768 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.459734 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:41:29.589800 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.589771 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn"] Apr 16 18:41:29.620832 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:41:29.620800 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e322a2_3594_4ce4_8cfc_8e23125160b6.slice/crio-c0c547141d4ef533686a9718811778b194b40fcd33a483fffbebb76a96046bc3 WatchSource:0}: Error finding container c0c547141d4ef533686a9718811778b194b40fcd33a483fffbebb76a96046bc3: Status 404 returned error can't find the container with id c0c547141d4ef533686a9718811778b194b40fcd33a483fffbebb76a96046bc3 Apr 16 18:41:29.851776 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.851743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" event={"ID":"f8e322a2-3594-4ce4-8cfc-8e23125160b6","Type":"ContainerStarted","Data":"2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102"} Apr 16 18:41:29.851952 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.851785 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" event={"ID":"f8e322a2-3594-4ce4-8cfc-8e23125160b6","Type":"ContainerStarted","Data":"c0c547141d4ef533686a9718811778b194b40fcd33a483fffbebb76a96046bc3"} Apr 16 18:41:29.853388 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.853359 2566 generic.go:358] "Generic (PLEG): container finished" podID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerID="7ac21c1d78d1d7eeda7426e6bd07b663eb63f9f88f9be64dbdd8d9ac423a752f" exitCode=0 Apr 16 18:41:29.853489 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:29.853403 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" event={"ID":"196370fa-fe37-448a-b863-3ae1ecd9d670","Type":"ContainerDied","Data":"7ac21c1d78d1d7eeda7426e6bd07b663eb63f9f88f9be64dbdd8d9ac423a752f"} Apr 16 18:41:30.166538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.166510 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:30.218083 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.218059 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/196370fa-fe37-448a-b863-3ae1ecd9d670-kserve-provision-location\") pod \"196370fa-fe37-448a-b863-3ae1ecd9d670\" (UID: \"196370fa-fe37-448a-b863-3ae1ecd9d670\") " Apr 16 18:41:30.218405 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.218379 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/196370fa-fe37-448a-b863-3ae1ecd9d670-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "196370fa-fe37-448a-b863-3ae1ecd9d670" (UID: "196370fa-fe37-448a-b863-3ae1ecd9d670"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:41:30.318705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.318672 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/196370fa-fe37-448a-b863-3ae1ecd9d670-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:41:30.858731 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.858705 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" Apr 16 18:41:30.858731 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.858723 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b" event={"ID":"196370fa-fe37-448a-b863-3ae1ecd9d670","Type":"ContainerDied","Data":"d5606ee85fcbba496a04dea412238f6535f8e9c1026808b7975bf4227d04be35"} Apr 16 18:41:30.858941 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.858768 2566 scope.go:117] "RemoveContainer" containerID="7ac21c1d78d1d7eeda7426e6bd07b663eb63f9f88f9be64dbdd8d9ac423a752f" Apr 16 18:41:30.866468 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.866454 2566 scope.go:117] "RemoveContainer" containerID="fda84c930de16f22746e0c78244ccfacbe7717e1e2802ac7f7aabaff9e6f2537" Apr 16 18:41:30.878799 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.878777 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b"] Apr 16 18:41:30.883599 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:30.883580 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-lightgbm-v2-kserve-predictor-68fc4f5b49-n6t5b"] Apr 16 18:41:32.459313 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:32.459280 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" path="/var/lib/kubelet/pods/196370fa-fe37-448a-b863-3ae1ecd9d670/volumes" Apr 16 18:41:33.871751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:33.871666 2566 generic.go:358] "Generic (PLEG): container finished" podID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerID="2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102" exitCode=0 Apr 16 18:41:33.871751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:33.871734 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" event={"ID":"f8e322a2-3594-4ce4-8cfc-8e23125160b6","Type":"ContainerDied","Data":"2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102"} Apr 16 18:41:34.877806 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:34.877772 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" event={"ID":"f8e322a2-3594-4ce4-8cfc-8e23125160b6","Type":"ContainerStarted","Data":"580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84"} Apr 16 18:41:34.878238 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:34.878019 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:41:34.899799 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:41:34.899747 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" 
podStartSLOduration=5.899728778 podStartE2EDuration="5.899728778s" podCreationTimestamp="2026-04-16 18:41:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:41:34.898978691 +0000 UTC m=+1470.991266654" watchObservedRunningTime="2026-04-16 18:41:34.899728778 +0000 UTC m=+1470.992016744" Apr 16 18:42:05.887721 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:05.887677 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:42:09.152949 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.152913 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn"] Apr 16 18:42:09.153398 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.153288 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" podUID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerName="kserve-container" containerID="cri-o://580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84" gracePeriod=30 Apr 16 18:42:09.236465 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.236430 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"] Apr 16 18:42:09.236797 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.236784 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="kserve-container" Apr 16 18:42:09.236846 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.236798 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="kserve-container" Apr 16 18:42:09.236846 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.236812 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="storage-initializer" Apr 16 18:42:09.236846 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.236818 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="storage-initializer" Apr 16 18:42:09.236936 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.236893 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="196370fa-fe37-448a-b863-3ae1ecd9d670" containerName="kserve-container" Apr 16 18:42:09.241686 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.241662 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:42:09.248911 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.248884 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"] Apr 16 18:42:09.343148 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.343098 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfc19bc7-4d39-4e89-9849-9917fdd5d6d0-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb\" (UID: \"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:42:09.444687 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.444583 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfc19bc7-4d39-4e89-9849-9917fdd5d6d0-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb\" (UID: \"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:42:09.444987 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.444965 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfc19bc7-4d39-4e89-9849-9917fdd5d6d0-kserve-provision-location\") pod \"isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb\" (UID: \"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0\") " pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:42:09.553698 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.553660 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:42:09.677385 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.677308 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"] Apr 16 18:42:09.679768 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:42:09.679734 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfc19bc7_4d39_4e89_9849_9917fdd5d6d0.slice/crio-7ddcc307d95b5ad8fd82ad6b024c22c2612c58be4e327498d398202fd31c6c91 WatchSource:0}: Error finding container 7ddcc307d95b5ad8fd82ad6b024c22c2612c58be4e327498d398202fd31c6c91: Status 404 returned error can't find the container with id 7ddcc307d95b5ad8fd82ad6b024c22c2612c58be4e327498d398202fd31c6c91 Apr 16 18:42:09.994635 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.994534 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerStarted","Data":"0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357"} Apr 16 18:42:09.994635 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:09.994582 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerStarted","Data":"7ddcc307d95b5ad8fd82ad6b024c22c2612c58be4e327498d398202fd31c6c91"} Apr 16 18:42:10.389558 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.389537 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:42:10.454024 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.453984 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e322a2-3594-4ce4-8cfc-8e23125160b6-kserve-provision-location\") pod \"f8e322a2-3594-4ce4-8cfc-8e23125160b6\" (UID: \"f8e322a2-3594-4ce4-8cfc-8e23125160b6\") " Apr 16 18:42:10.454357 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.454334 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e322a2-3594-4ce4-8cfc-8e23125160b6-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f8e322a2-3594-4ce4-8cfc-8e23125160b6" (UID: "f8e322a2-3594-4ce4-8cfc-8e23125160b6"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:42:10.555649 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.555569 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8e322a2-3594-4ce4-8cfc-8e23125160b6-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:42:10.999237 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.999203 2566 generic.go:358] "Generic (PLEG): container finished" podID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerID="580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84" exitCode=0 Apr 16 18:42:10.999396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.999267 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" Apr 16 18:42:10.999396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.999288 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" event={"ID":"f8e322a2-3594-4ce4-8cfc-8e23125160b6","Type":"ContainerDied","Data":"580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84"} Apr 16 18:42:10.999396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.999327 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn" event={"ID":"f8e322a2-3594-4ce4-8cfc-8e23125160b6","Type":"ContainerDied","Data":"c0c547141d4ef533686a9718811778b194b40fcd33a483fffbebb76a96046bc3"} Apr 16 18:42:10.999396 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:10.999342 2566 scope.go:117] "RemoveContainer" containerID="580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84" Apr 16 18:42:11.007377 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:11.007361 2566 scope.go:117] "RemoveContainer" containerID="2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102" Apr 16 18:42:11.014171 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:11.014143 2566 scope.go:117] "RemoveContainer" containerID="580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84" Apr 16 18:42:11.014420 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:42:11.014403 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84\": container with ID starting with 580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84 not found: ID does not exist" containerID="580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84" Apr 16 18:42:11.014496 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:11.014432 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84"} err="failed to get container status \"580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84\": rpc error: code = NotFound desc = could not find container \"580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84\": container with ID starting with 580dad5d3dcfbcfb24327f51d5756bdfcdc92dbf4d471dfdc49c4fcd2fc9db84 not found: ID does not exist" Apr 16 18:42:11.014496 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:11.014457 2566 scope.go:117] "RemoveContainer" containerID="2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102" Apr 16 18:42:11.014694 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:42:11.014675 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102\": container with ID starting with 2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102 not found: ID does not exist" containerID="2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102" Apr 16 18:42:11.014749 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:11.014702 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102"} err="failed to get container status \"2082b50a677cce28b54cef40b50fb4ba8984b0a132194980369cad01fe5e1102\": rpc 
Apr 16 18:42:11.020922 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:11.020899 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn"]
Apr 16 18:42:11.025375 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:11.025353 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-mlflow-v2-runtime-predictor-5497fc5f68-l6mzn"]
Apr 16 18:42:12.452515 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:12.452474 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" path="/var/lib/kubelet/pods/f8e322a2-3594-4ce4-8cfc-8e23125160b6/volumes"
Apr 16 18:42:14.010382 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:14.010347 2566 generic.go:358] "Generic (PLEG): container finished" podID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerID="0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357" exitCode=0
Apr 16 18:42:14.010779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:14.010396 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerDied","Data":"0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357"}
Apr 16 18:42:15.014751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:15.014714 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerStarted","Data":"88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036"}
Apr 16 18:42:17.025970 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:17.025883 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerStarted","Data":"112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb"}
Apr 16 18:42:17.026387 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:17.026019 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"
Apr 16 18:42:17.056114 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:17.056046 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" podStartSLOduration=5.37876674 podStartE2EDuration="8.056027531s" podCreationTimestamp="2026-04-16 18:42:09 +0000 UTC" firstStartedPulling="2026-04-16 18:42:14.066067476 +0000 UTC m=+1510.158355422" lastFinishedPulling="2026-04-16 18:42:16.74332827 +0000 UTC m=+1512.835616213" observedRunningTime="2026-04-16 18:42:17.053495865 +0000 UTC m=+1513.145783832" watchObservedRunningTime="2026-04-16 18:42:17.056027531 +0000 UTC m=+1513.148315498"
Apr 16 18:42:18.029630 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:18.029597 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"
Apr 16 18:42:49.035782 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:42:49.035700 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"
Apr 16 18:43:19.037542 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.037511 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"
Apr 16 18:43:19.314740 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.314658 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"]
Apr 16 18:43:19.315018 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.314936 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-container" containerID="cri-o://88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036" gracePeriod=30
Apr 16 18:43:19.315093 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.314977 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-agent" containerID="cri-o://112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb" gracePeriod=30
Apr 16 18:43:19.380818 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.380785 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"]
Apr 16 18:43:19.381226 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.381209 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerName="kserve-container"
Apr 16 18:43:19.381226 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.381227 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerName="kserve-container"
Apr 16 18:43:19.381340 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.381247 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerName="storage-initializer"
Apr 16 18:43:19.381340 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.381253 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerName="storage-initializer"
Apr 16 18:43:19.381340 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.381322 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e322a2-3594-4ce4-8cfc-8e23125160b6" containerName="kserve-container"
Apr 16 18:43:19.385474 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.385457 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"
Apr 16 18:43:19.394021 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.393981 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"]
Apr 16 18:43:19.429214 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.429185 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34aac4d2-f3c1-4dc3-989f-9348ce3eee19-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zvvw7\" (UID: \"34aac4d2-f3c1-4dc3-989f-9348ce3eee19\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"
Apr 16 18:43:19.529714 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.529678 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34aac4d2-f3c1-4dc3-989f-9348ce3eee19-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zvvw7\" (UID: \"34aac4d2-f3c1-4dc3-989f-9348ce3eee19\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"
Apr 16 18:43:19.530085 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.530067 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34aac4d2-f3c1-4dc3-989f-9348ce3eee19-kserve-provision-location\") pod \"isvc-paddle-predictor-7dddcb4bd4-zvvw7\" (UID: \"34aac4d2-f3c1-4dc3-989f-9348ce3eee19\") " pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"
Apr 16 18:43:19.695620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.695595 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" Apr 16 18:43:19.819189 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:19.819164 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"] Apr 16 18:43:19.821416 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:43:19.821382 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34aac4d2_f3c1_4dc3_989f_9348ce3eee19.slice/crio-a6f7decdfdd0e6cd19454735751492317625a38ba12ea1e20dc50c4cb54bc1e6 WatchSource:0}: Error finding container a6f7decdfdd0e6cd19454735751492317625a38ba12ea1e20dc50c4cb54bc1e6: Status 404 returned error can't find the container with id a6f7decdfdd0e6cd19454735751492317625a38ba12ea1e20dc50c4cb54bc1e6 Apr 16 18:43:20.214494 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:20.214451 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" event={"ID":"34aac4d2-f3c1-4dc3-989f-9348ce3eee19","Type":"ContainerStarted","Data":"f9a9f56ad2cb41c84ed27dd89ea6e7c10d3a7ffb06ec64dc06e20a07140c8174"} Apr 16 18:43:20.214494 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:20.214494 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" event={"ID":"34aac4d2-f3c1-4dc3-989f-9348ce3eee19","Type":"ContainerStarted","Data":"a6f7decdfdd0e6cd19454735751492317625a38ba12ea1e20dc50c4cb54bc1e6"} Apr 16 18:43:22.222219 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:22.222188 2566 generic.go:358] "Generic (PLEG): container finished" podID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerID="88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036" exitCode=0 Apr 16 18:43:22.222219 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:22.222228 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerDied","Data":"88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036"} Apr 16 18:43:25.233144 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:25.233104 2566 generic.go:358] "Generic (PLEG): container finished" podID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerID="f9a9f56ad2cb41c84ed27dd89ea6e7c10d3a7ffb06ec64dc06e20a07140c8174" exitCode=0 Apr 16 18:43:25.233534 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:25.233184 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" event={"ID":"34aac4d2-f3c1-4dc3-989f-9348ce3eee19","Type":"ContainerDied","Data":"f9a9f56ad2cb41c84ed27dd89ea6e7c10d3a7ffb06ec64dc06e20a07140c8174"} Apr 16 18:43:29.033409 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:29.033363 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.33:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 18:43:37.273081 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:37.273045 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" 
event={"ID":"34aac4d2-f3c1-4dc3-989f-9348ce3eee19","Type":"ContainerStarted","Data":"bad75225735b1c45d7133b99e13d30d46c5e043074244746bf79727b5eb761ef"} Apr 16 18:43:37.273493 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:37.273344 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" Apr 16 18:43:37.274656 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:37.274628 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 18:43:37.291827 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:37.291783 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podStartSLOduration=7.252252156 podStartE2EDuration="18.291770419s" podCreationTimestamp="2026-04-16 18:43:19 +0000 UTC" firstStartedPulling="2026-04-16 18:43:25.234289078 +0000 UTC m=+1581.326577021" lastFinishedPulling="2026-04-16 18:43:36.27380734 +0000 UTC m=+1592.366095284" observedRunningTime="2026-04-16 18:43:37.290260099 +0000 UTC m=+1593.382548066" watchObservedRunningTime="2026-04-16 18:43:37.291770419 +0000 UTC m=+1593.384058427" Apr 16 18:43:38.276013 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:38.275961 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 18:43:39.033538 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:39.033497 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.33:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 18:43:48.276720 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:48.276675 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 18:43:49.033168 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:49.033120 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.33:8080/v1/models/isvc-sklearn-mcp\": dial tcp 10.132.0.33:8080: connect: connection refused" Apr 16 18:43:49.033334 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:49.033243 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:43:49.458748 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:49.458724 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:43:49.492112 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:49.492084 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfc19bc7-4d39-4e89-9849-9917fdd5d6d0-kserve-provision-location\") pod \"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0\" (UID: \"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0\") " Apr 16 18:43:49.492389 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:49.492369 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfc19bc7-4d39-4e89-9849-9917fdd5d6d0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" (UID: "dfc19bc7-4d39-4e89-9849-9917fdd5d6d0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:43:49.593208 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:49.593170 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dfc19bc7-4d39-4e89-9849-9917fdd5d6d0-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:43:50.311537 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.311503 2566 generic.go:358] "Generic (PLEG): container finished" podID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerID="112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb" exitCode=0 Apr 16 18:43:50.311728 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.311575 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" Apr 16 18:43:50.311728 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.311579 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerDied","Data":"112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb"} Apr 16 18:43:50.311728 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.311618 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb" event={"ID":"dfc19bc7-4d39-4e89-9849-9917fdd5d6d0","Type":"ContainerDied","Data":"7ddcc307d95b5ad8fd82ad6b024c22c2612c58be4e327498d398202fd31c6c91"} Apr 16 18:43:50.311728 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.311639 2566 scope.go:117] "RemoveContainer" containerID="112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb" Apr 16 18:43:50.319668 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.319507 2566 scope.go:117] "RemoveContainer" containerID="88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036" Apr 16 18:43:50.326462 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.326445 2566 scope.go:117] "RemoveContainer" containerID="0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357" Apr 16 18:43:50.334243 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.334224 2566 scope.go:117] "RemoveContainer" containerID="112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb" Apr 16 18:43:50.334497 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:43:50.334476 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb\": container with ID starting with 112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb not found: ID does not exist" containerID="112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb" Apr 16 18:43:50.334550 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.334509 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb"} err="failed to get container status \"112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb\": rpc error: code = NotFound desc = could not find container \"112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb\": container with ID starting with 112c37ee042b6899ec1afd8ea5d234f9710937169a58913953bc3f63cd431bdb not found: ID does not exist" Apr 16 18:43:50.334550 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.334532 2566 scope.go:117] "RemoveContainer" containerID="88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036" Apr 16 18:43:50.334765 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:43:50.334748 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036\": container with ID starting with 88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036 not found: ID does not exist" containerID="88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036" Apr 16 18:43:50.334809 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.334771 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036"} err="failed to get container status \"88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036\": rpc error: code = NotFound desc = could not find container \"88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036\": container with ID starting with 88237ccb5fd44a3eae1404cab4e027b3015721300bb0ea53b7ab0b56f0652036 not found: ID does not exist" Apr 16 18:43:50.334809 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.334800 2566 scope.go:117] "RemoveContainer" containerID="0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357" Apr 16 18:43:50.334890 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.334866 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"] Apr 16 18:43:50.335073 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:43:50.335054 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357\": container with ID starting with 0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357 not found: ID does not exist" containerID="0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357" Apr 16 18:43:50.335130 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.335078 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357"} err="failed to get container status \"0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357\": rpc error: code = NotFound desc = could not find container 
\"0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357\": container with ID starting with 0cf28d7ec2669e7e964ae9dc2bca2be8a8a1731c67831f39d3de311411172357 not found: ID does not exist" Apr 16 18:43:50.338828 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.338805 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-mcp-predictor-6bb8d86b55-r6xzb"] Apr 16 18:43:50.453567 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:50.453531 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" path="/var/lib/kubelet/pods/dfc19bc7-4d39-4e89-9849-9917fdd5d6d0/volumes" Apr 16 18:43:58.276650 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:43:58.276565 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 18:44:08.276137 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:08.276095 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 18:44:18.275933 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:18.275894 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.34:8080: connect: connection refused" Apr 16 18:44:28.277019 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:28.276963 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" Apr 16 18:44:30.933068 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.933038 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"] Apr 16 18:44:30.933531 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.933281 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" containerID="cri-o://bad75225735b1c45d7133b99e13d30d46c5e043074244746bf79727b5eb761ef" gracePeriod=30 Apr 16 18:44:30.986095 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.986061 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"] Apr 16 18:44:30.986945 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.986913 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="storage-initializer" Apr 16 18:44:30.987104 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.986951 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="storage-initializer" Apr 16 18:44:30.987104 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.987006 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-agent" Apr 16 18:44:30.987104 ip-10-0-136-226 
Apr 16 18:44:30.987104 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.987029 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-container"
Apr 16 18:44:30.987104 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.987038 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-container"
Apr 16 18:44:30.987345 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.987231 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-agent"
Apr 16 18:44:30.987345 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:30.987253 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="dfc19bc7-4d39-4e89-9849-9917fdd5d6d0" containerName="kserve-container"
Apr 16 18:44:31.003585 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.003546 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"]
Apr 16 18:44:31.003748 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.003662 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"
Apr 16 18:44:31.151887 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.151852 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfd785-7953-43f2-9968-483dd6f3b17f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-6rtwv\" (UID: \"34cfd785-7953-43f2-9968-483dd6f3b17f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"
Apr 16 18:44:31.252732 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.252651 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfd785-7953-43f2-9968-483dd6f3b17f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-6rtwv\" (UID: \"34cfd785-7953-43f2-9968-483dd6f3b17f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"
Apr 16 18:44:31.253027 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.252984 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfd785-7953-43f2-9968-483dd6f3b17f-kserve-provision-location\") pod \"isvc-paddle-runtime-predictor-86b49c4466-6rtwv\" (UID: \"34cfd785-7953-43f2-9968-483dd6f3b17f\") " pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"
Apr 16 18:44:31.313631 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.313605 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"
Apr 16 18:44:31.432567 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.432541 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"]
Apr 16 18:44:31.435186 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:44:31.435153 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34cfd785_7953_43f2_9968_483dd6f3b17f.slice/crio-169497f08a421c8c0aead534b43de6b6b18bac018c150ac8bcc2b87279f21aaa WatchSource:0}: Error finding container 169497f08a421c8c0aead534b43de6b6b18bac018c150ac8bcc2b87279f21aaa: Status 404 returned error can't find the container with id 169497f08a421c8c0aead534b43de6b6b18bac018c150ac8bcc2b87279f21aaa
Apr 16 18:44:31.440638 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:31.440607 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" event={"ID":"34cfd785-7953-43f2-9968-483dd6f3b17f","Type":"ContainerStarted","Data":"169497f08a421c8c0aead534b43de6b6b18bac018c150ac8bcc2b87279f21aaa"}
Apr 16 18:44:32.445403 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:32.445368 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" event={"ID":"34cfd785-7953-43f2-9968-483dd6f3b17f","Type":"ContainerStarted","Data":"44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0"}
Apr 16 18:44:33.450673 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:33.450648 2566 generic.go:358] "Generic (PLEG): container finished" podID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerID="bad75225735b1c45d7133b99e13d30d46c5e043074244746bf79727b5eb761ef" exitCode=0
Apr 16 18:44:33.450975 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:33.450723 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" event={"ID":"34aac4d2-f3c1-4dc3-989f-9348ce3eee19","Type":"ContainerDied","Data":"bad75225735b1c45d7133b99e13d30d46c5e043074244746bf79727b5eb761ef"}
Apr 16 18:44:33.572667 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:33.572647 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"
Apr 16 18:44:33.671344 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:33.671252 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34aac4d2-f3c1-4dc3-989f-9348ce3eee19-kserve-provision-location\") pod \"34aac4d2-f3c1-4dc3-989f-9348ce3eee19\" (UID: \"34aac4d2-f3c1-4dc3-989f-9348ce3eee19\") "
Apr 16 18:44:33.680260 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:33.680224 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34aac4d2-f3c1-4dc3-989f-9348ce3eee19-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34aac4d2-f3c1-4dc3-989f-9348ce3eee19" (UID: "34aac4d2-f3c1-4dc3-989f-9348ce3eee19"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:44:33.772449 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:33.772423 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34aac4d2-f3c1-4dc3-989f-9348ce3eee19-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:44:34.455821 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:34.455779 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7" event={"ID":"34aac4d2-f3c1-4dc3-989f-9348ce3eee19","Type":"ContainerDied","Data":"a6f7decdfdd0e6cd19454735751492317625a38ba12ea1e20dc50c4cb54bc1e6"}
Apr 16 18:44:34.455821 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:34.455816 2566 scope.go:117] "RemoveContainer" containerID="bad75225735b1c45d7133b99e13d30d46c5e043074244746bf79727b5eb761ef"
Apr 16 18:44:34.455821 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:34.455818 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"
Apr 16 18:44:34.464431 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:34.464414 2566 scope.go:117] "RemoveContainer" containerID="f9a9f56ad2cb41c84ed27dd89ea6e7c10d3a7ffb06ec64dc06e20a07140c8174"
Apr 16 18:44:34.478550 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:34.478529 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"]
Apr 16 18:44:34.483050 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:34.483027 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-predictor-7dddcb4bd4-zvvw7"]
Apr 16 18:44:36.451775 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:36.451741 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" path="/var/lib/kubelet/pods/34aac4d2-f3c1-4dc3-989f-9348ce3eee19/volumes"
Apr 16 18:44:36.464618 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:36.464588 2566 generic.go:358] "Generic (PLEG): container finished" podID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerID="44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0" exitCode=0
Apr 16 18:44:36.464771 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:36.464663 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" event={"ID":"34cfd785-7953-43f2-9968-483dd6f3b17f","Type":"ContainerDied","Data":"44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0"}
Apr 16 18:44:37.469380 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:37.469348 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" event={"ID":"34cfd785-7953-43f2-9968-483dd6f3b17f","Type":"ContainerStarted","Data":"33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1"}
Apr 16 18:44:37.469743 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:37.469642 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"
Apr 16 18:44:37.470935 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:37.470912 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused"
probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 18:44:37.492298 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:37.492259 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" podStartSLOduration=7.492246832 podStartE2EDuration="7.492246832s" podCreationTimestamp="2026-04-16 18:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:44:37.490925856 +0000 UTC m=+1653.583213822" watchObservedRunningTime="2026-04-16 18:44:37.492246832 +0000 UTC m=+1653.584534797" Apr 16 18:44:38.472184 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:38.472146 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 18:44:48.472983 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:48.472939 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 18:44:58.472269 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:44:58.472227 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 18:45:08.472768 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:08.472729 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.35:8080: connect: connection refused" Apr 16 18:45:18.474123 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:18.474095 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" Apr 16 18:45:22.529773 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.529738 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"] Apr 16 18:45:22.530190 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.530079 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" containerID="cri-o://33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1" gracePeriod=30 Apr 16 18:45:22.588203 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.588172 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"] Apr 16 18:45:22.588540 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.588527 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="storage-initializer" Apr 16 18:45:22.588587 ip-10-0-136-226 
kubenswrapper[2566]: I0416 18:45:22.588542 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="storage-initializer" Apr 16 18:45:22.588587 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.588556 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" Apr 16 18:45:22.588587 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.588561 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" Apr 16 18:45:22.588679 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.588617 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="34aac4d2-f3c1-4dc3-989f-9348ce3eee19" containerName="kserve-container" Apr 16 18:45:22.591719 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.591699 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" Apr 16 18:45:22.600784 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.600759 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"] Apr 16 18:45:22.680578 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.680541 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804a19e4-fa4a-4ae5-a77e-014e307040da-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng\" (UID: \"804a19e4-fa4a-4ae5-a77e-014e307040da\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" Apr 16 18:45:22.781179 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.781092 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804a19e4-fa4a-4ae5-a77e-014e307040da-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng\" (UID: \"804a19e4-fa4a-4ae5-a77e-014e307040da\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" Apr 16 18:45:22.781464 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.781437 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804a19e4-fa4a-4ae5-a77e-014e307040da-kserve-provision-location\") pod \"isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng\" (UID: \"804a19e4-fa4a-4ae5-a77e-014e307040da\") " pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" Apr 16 18:45:22.902865 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:22.902816 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" Apr 16 18:45:23.024654 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:23.024625 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"] Apr 16 18:45:23.027687 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:45:23.027662 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod804a19e4_fa4a_4ae5_a77e_014e307040da.slice/crio-85cccd12e87160e8684b33d5c67eac76af8dad3c0447b6b8febbc9c157278afd WatchSource:0}: Error finding container 85cccd12e87160e8684b33d5c67eac76af8dad3c0447b6b8febbc9c157278afd: Status 404 returned error can't find the container with id 85cccd12e87160e8684b33d5c67eac76af8dad3c0447b6b8febbc9c157278afd Apr 16 18:45:23.609620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:23.609536 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" event={"ID":"804a19e4-fa4a-4ae5-a77e-014e307040da","Type":"ContainerStarted","Data":"d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85"} Apr 16 18:45:23.609620 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:23.609583 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" event={"ID":"804a19e4-fa4a-4ae5-a77e-014e307040da","Type":"ContainerStarted","Data":"85cccd12e87160e8684b33d5c67eac76af8dad3c0447b6b8febbc9c157278afd"} Apr 16 18:45:25.263667 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.263645 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" Apr 16 18:45:25.399922 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.399895 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfd785-7953-43f2-9968-483dd6f3b17f-kserve-provision-location\") pod \"34cfd785-7953-43f2-9968-483dd6f3b17f\" (UID: \"34cfd785-7953-43f2-9968-483dd6f3b17f\") " Apr 16 18:45:25.409634 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.409610 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34cfd785-7953-43f2-9968-483dd6f3b17f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "34cfd785-7953-43f2-9968-483dd6f3b17f" (UID: "34cfd785-7953-43f2-9968-483dd6f3b17f"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:45:25.501191 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.501161 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/34cfd785-7953-43f2-9968-483dd6f3b17f-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:45:25.619034 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.618978 2566 generic.go:358] "Generic (PLEG): container finished" podID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerID="33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1" exitCode=0 Apr 16 18:45:25.619034 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.619023 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" event={"ID":"34cfd785-7953-43f2-9968-483dd6f3b17f","Type":"ContainerDied","Data":"33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1"} Apr 16 18:45:25.619229 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.619061 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" event={"ID":"34cfd785-7953-43f2-9968-483dd6f3b17f","Type":"ContainerDied","Data":"169497f08a421c8c0aead534b43de6b6b18bac018c150ac8bcc2b87279f21aaa"} Apr 16 18:45:25.619229 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.619064 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv" Apr 16 18:45:25.619229 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.619078 2566 scope.go:117] "RemoveContainer" containerID="33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1" Apr 16 18:45:25.627423 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.627408 2566 scope.go:117] "RemoveContainer" containerID="44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0" Apr 16 18:45:25.634562 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.634371 2566 scope.go:117] "RemoveContainer" containerID="33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1" Apr 16 18:45:25.634704 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:45:25.634613 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1\": container with ID starting with 33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1 not found: ID does not exist" containerID="33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1" Apr 16 18:45:25.634704 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.634636 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1"} err="failed to get container status \"33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1\": rpc error: code = NotFound desc = could not find container \"33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1\": container with ID starting with 33e61675a8566366b7a920636ac1a6f3f5eabeaa8835eae5c23769eafdaf1bf1 not found: ID does not exist" Apr 16 18:45:25.634704 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.634653 2566 scope.go:117] "RemoveContainer" containerID="44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0" Apr 16 18:45:25.634844 ip-10-0-136-226 
kubenswrapper[2566]: E0416 18:45:25.634821 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0\": container with ID starting with 44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0 not found: ID does not exist" containerID="44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0" Apr 16 18:45:25.634844 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.634836 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0"} err="failed to get container status \"44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0\": rpc error: code = NotFound desc = could not find container \"44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0\": container with ID starting with 44307408b7b1eb7c6468fac28d6ea7c50488cfb95991cbc5b83638ef8a28eff0 not found: ID does not exist" Apr 16 18:45:25.642543 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.642520 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"] Apr 16 18:45:25.646835 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:25.646814 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-runtime-predictor-86b49c4466-6rtwv"] Apr 16 18:45:26.452528 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:26.452498 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" path="/var/lib/kubelet/pods/34cfd785-7953-43f2-9968-483dd6f3b17f/volumes" Apr 16 18:45:27.628086 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:27.628051 2566 generic.go:358] "Generic (PLEG): container finished" podID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerID="d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85" exitCode=0 Apr 16 18:45:27.628463 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:27.628125 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" event={"ID":"804a19e4-fa4a-4ae5-a77e-014e307040da","Type":"ContainerDied","Data":"d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85"} Apr 16 18:45:28.633023 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:28.632973 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" event={"ID":"804a19e4-fa4a-4ae5-a77e-014e307040da","Type":"ContainerStarted","Data":"76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb"} Apr 16 18:45:28.633408 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:28.633326 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" Apr 16 18:45:28.634703 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:28.634674 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 18:45:28.650669 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:28.650627 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" podStartSLOduration=6.650616112 podStartE2EDuration="6.650616112s" podCreationTimestamp="2026-04-16 18:45:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:45:28.650223918 +0000 UTC m=+1704.742511884" watchObservedRunningTime="2026-04-16 18:45:28.650616112 +0000 UTC m=+1704.742904071" Apr 16 18:45:29.637056 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:29.637021 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 18:45:39.637770 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:39.637729 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 18:45:49.637901 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:49.637865 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 18:45:59.637890 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:45:59.637843 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.36:8080: connect: connection refused" Apr 16 18:46:09.638198 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:09.638160 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" Apr 16 18:46:14.339838 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.339803 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"] Apr 16 18:46:14.340257 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.340083 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container" containerID="cri-o://76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb" gracePeriod=30 Apr 16 18:46:14.409468 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.409435 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"] Apr 16 18:46:14.409764 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.409753 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" Apr 16 18:46:14.409815 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.409765 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container" Apr 16 18:46:14.409815 ip-10-0-136-226 kubenswrapper[2566]: I0416 
Apr 16 18:46:14.409815 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.409785 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="storage-initializer"
Apr 16 18:46:14.409908 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.409847 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="34cfd785-7953-43f2-9968-483dd6f3b17f" containerName="kserve-container"
Apr 16 18:46:14.412905 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.412886 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"
Apr 16 18:46:14.420813 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.420787 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"]
Apr 16 18:46:14.598971 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.598879 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9240f277-c91a-434b-a1b3-1786a641bc8a-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-99nbf\" (UID: \"9240f277-c91a-434b-a1b3-1786a641bc8a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"
Apr 16 18:46:14.699883 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.699848 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9240f277-c91a-434b-a1b3-1786a641bc8a-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-99nbf\" (UID: \"9240f277-c91a-434b-a1b3-1786a641bc8a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"
Apr 16 18:46:14.700223 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.700203 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9240f277-c91a-434b-a1b3-1786a641bc8a-kserve-provision-location\") pod \"isvc-pmml-predictor-5584ffd8c9-99nbf\" (UID: \"9240f277-c91a-434b-a1b3-1786a641bc8a\") " pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"
Apr 16 18:46:14.723487 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.723468 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"
Apr 16 18:46:14.842819 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.842796 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"]
Apr 16 18:46:14.845106 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:46:14.845072 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9240f277_c91a_434b_a1b3_1786a641bc8a.slice/crio-88671fb723c218ea086483a10f85d1000a4c1f0dc7c3089891366d59c0f08f6d WatchSource:0}: Error finding container 88671fb723c218ea086483a10f85d1000a4c1f0dc7c3089891366d59c0f08f6d: Status 404 returned error can't find the container with id 88671fb723c218ea086483a10f85d1000a4c1f0dc7c3089891366d59c0f08f6d
Apr 16 18:46:14.847385 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:14.847367 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:46:15.785729 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:15.785689 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" event={"ID":"9240f277-c91a-434b-a1b3-1786a641bc8a","Type":"ContainerStarted","Data":"f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7"}
Apr 16 18:46:15.785729 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:15.785732 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" event={"ID":"9240f277-c91a-434b-a1b3-1786a641bc8a","Type":"ContainerStarted","Data":"88671fb723c218ea086483a10f85d1000a4c1f0dc7c3089891366d59c0f08f6d"}
Apr 16 18:46:16.984358 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:16.984332 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"
Apr 16 18:46:17.121802 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.121770 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804a19e4-fa4a-4ae5-a77e-014e307040da-kserve-provision-location\") pod \"804a19e4-fa4a-4ae5-a77e-014e307040da\" (UID: \"804a19e4-fa4a-4ae5-a77e-014e307040da\") "
Apr 16 18:46:17.131719 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.131681 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/804a19e4-fa4a-4ae5-a77e-014e307040da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "804a19e4-fa4a-4ae5-a77e-014e307040da" (UID: "804a19e4-fa4a-4ae5-a77e-014e307040da"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:46:17.222496 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.222470 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/804a19e4-fa4a-4ae5-a77e-014e307040da-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:46:17.793184 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.793145 2566 generic.go:358] "Generic (PLEG): container finished" podID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerID="76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb" exitCode=0
Apr 16 18:46:17.793349 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.793215 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"
Apr 16 18:46:17.793349 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.793240 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" event={"ID":"804a19e4-fa4a-4ae5-a77e-014e307040da","Type":"ContainerDied","Data":"76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb"}
Apr 16 18:46:17.793349 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.793272 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng" event={"ID":"804a19e4-fa4a-4ae5-a77e-014e307040da","Type":"ContainerDied","Data":"85cccd12e87160e8684b33d5c67eac76af8dad3c0447b6b8febbc9c157278afd"}
Apr 16 18:46:17.793349 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.793287 2566 scope.go:117] "RemoveContainer" containerID="76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb"
Apr 16 18:46:17.808039 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.808019 2566 scope.go:117] "RemoveContainer" containerID="d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85"
Apr 16 18:46:17.815090 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.815072 2566 scope.go:117] "RemoveContainer" containerID="76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb"
Apr 16 18:46:17.815344 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:46:17.815324 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb\": container with ID starting with 76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb not found: ID does not exist" containerID="76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb"
Apr 16 18:46:17.815394 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.815354 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb"} err="failed to get container status \"76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb\": rpc error: code = NotFound desc = could not find container \"76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb\": container with ID starting with 76552fd22e761bdf810f3c74b91bb4acb175752c19dc2f5709d439b2d9d181fb not found: ID does not exist"
Apr 16 18:46:17.815394 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.815371 2566 scope.go:117] "RemoveContainer" containerID="d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85"
Apr 16 18:46:17.815589 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:46:17.815568 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85\": container with ID starting with d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85 not found: ID does not exist" containerID="d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85"
Apr 16 18:46:17.815632 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.815596 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85"} err="failed to get container status \"d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85\": rpc error: code = NotFound desc = could not find container \"d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85\": container with ID starting with d4d414b27a44be7a797679ccfdffa3d5484b97a61ab5c92f9a5b54d69dafba85 not found: ID does not exist"
Apr 16 18:46:17.819815 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.819796 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"]
Apr 16 18:46:17.823972 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:17.823952 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-paddle-v2-kserve-predictor-7c6bd76f7b-q2tng"]
Apr 16 18:46:18.452778 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:18.452743 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" path="/var/lib/kubelet/pods/804a19e4-fa4a-4ae5-a77e-014e307040da/volumes"
Apr 16 18:46:18.800659 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:18.800627 2566 generic.go:358] "Generic (PLEG): container finished" podID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerID="f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7" exitCode=0
Apr 16 18:46:18.800832 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:18.800707 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" event={"ID":"9240f277-c91a-434b-a1b3-1786a641bc8a","Type":"ContainerDied","Data":"f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7"}
Apr 16 18:46:25.826781 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:25.826743 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" event={"ID":"9240f277-c91a-434b-a1b3-1786a641bc8a","Type":"ContainerStarted","Data":"2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df"}
Apr 16 18:46:25.827216 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:25.827026 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"
Apr 16 18:46:25.828414 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:25.828388 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:46:25.845067 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:25.845030 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podStartSLOduration=4.954087489 podStartE2EDuration="11.845017899s" podCreationTimestamp="2026-04-16 18:46:14 +0000 UTC" firstStartedPulling="2026-04-16 18:46:18.801851091 +0000 UTC m=+1754.894139034" lastFinishedPulling="2026-04-16 18:46:25.692781497 +0000 UTC m=+1761.785069444" observedRunningTime="2026-04-16 18:46:25.8440946 +0000 UTC m=+1761.936382577" watchObservedRunningTime="2026-04-16 18:46:25.845017899 +0000 UTC m=+1761.937305864"
Apr 16 18:46:26.830820 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:26.830780 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:46:36.831452 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:36.831410 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:46:46.830952 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:46.830906 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:46:56.830952 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:46:56.830856 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:47:06.830893 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:06.830848 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:47:16.831668 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:16.831621 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:47:26.831661 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:26.831617 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:47:36.831105 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:36.831055 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:47:40.453032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:40.452973 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.37:8080: connect: connection refused"
Apr 16 18:47:50.453452 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:50.453415 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"
Apr 16 18:47:55.442550 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.442513 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"]
Apr 16 18:47:55.443014 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.442830 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" containerID="cri-o://2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df" gracePeriod=30
Apr 16 18:47:55.509348 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.509313 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5"]
Apr 16 18:47:55.509653 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.509641 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="storage-initializer"
Apr 16 18:47:55.509699 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.509654 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="storage-initializer"
Apr 16 18:47:55.509699 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.509688 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container"
Apr 16 18:47:55.509699 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.509694 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container"
Apr 16 18:47:55.509792 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.509747 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="804a19e4-fa4a-4ae5-a77e-014e307040da" containerName="kserve-container"
Apr 16 18:47:55.512880 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.512855 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5"
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:47:55.521343 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.521320 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5"] Apr 16 18:47:55.605589 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.605557 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf1c71ad-f082-47af-ac96-c09b63bf071f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wmsz5\" (UID: \"bf1c71ad-f082-47af-ac96-c09b63bf071f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:47:55.706933 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.706853 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf1c71ad-f082-47af-ac96-c09b63bf071f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wmsz5\" (UID: \"bf1c71ad-f082-47af-ac96-c09b63bf071f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:47:55.707239 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.707218 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf1c71ad-f082-47af-ac96-c09b63bf071f-kserve-provision-location\") pod \"isvc-pmml-runtime-predictor-7576f6b69f-wmsz5\" (UID: \"bf1c71ad-f082-47af-ac96-c09b63bf071f\") " pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:47:55.824408 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.824369 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:47:55.943458 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:55.943435 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5"] Apr 16 18:47:55.946011 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:47:55.945968 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf1c71ad_f082_47af_ac96_c09b63bf071f.slice/crio-3c4208c7ebfb3ed12cc755e12bd39731f78e68654dbe0c4911d97e6b0ff2cd8c WatchSource:0}: Error finding container 3c4208c7ebfb3ed12cc755e12bd39731f78e68654dbe0c4911d97e6b0ff2cd8c: Status 404 returned error can't find the container with id 3c4208c7ebfb3ed12cc755e12bd39731f78e68654dbe0c4911d97e6b0ff2cd8c Apr 16 18:47:56.107861 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:56.107826 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" event={"ID":"bf1c71ad-f082-47af-ac96-c09b63bf071f","Type":"ContainerStarted","Data":"1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20"} Apr 16 18:47:56.107861 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:56.107866 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" event={"ID":"bf1c71ad-f082-47af-ac96-c09b63bf071f","Type":"ContainerStarted","Data":"3c4208c7ebfb3ed12cc755e12bd39731f78e68654dbe0c4911d97e6b0ff2cd8c"} Apr 16 18:47:58.983778 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:58.983755 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" Apr 16 18:47:59.035568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.035494 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9240f277-c91a-434b-a1b3-1786a641bc8a-kserve-provision-location\") pod \"9240f277-c91a-434b-a1b3-1786a641bc8a\" (UID: \"9240f277-c91a-434b-a1b3-1786a641bc8a\") " Apr 16 18:47:59.035841 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.035815 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9240f277-c91a-434b-a1b3-1786a641bc8a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9240f277-c91a-434b-a1b3-1786a641bc8a" (UID: "9240f277-c91a-434b-a1b3-1786a641bc8a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:47:59.117776 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.117738 2566 generic.go:358] "Generic (PLEG): container finished" podID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerID="2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df" exitCode=0 Apr 16 18:47:59.117906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.117800 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" Apr 16 18:47:59.117906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.117825 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" event={"ID":"9240f277-c91a-434b-a1b3-1786a641bc8a","Type":"ContainerDied","Data":"2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df"} Apr 16 18:47:59.117906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.117862 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf" event={"ID":"9240f277-c91a-434b-a1b3-1786a641bc8a","Type":"ContainerDied","Data":"88671fb723c218ea086483a10f85d1000a4c1f0dc7c3089891366d59c0f08f6d"} Apr 16 18:47:59.117906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.117876 2566 scope.go:117] "RemoveContainer" containerID="2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df" Apr 16 18:47:59.128169 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.127713 2566 scope.go:117] "RemoveContainer" containerID="f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7" Apr 16 18:47:59.135442 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.135420 2566 scope.go:117] "RemoveContainer" containerID="2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df" Apr 16 18:47:59.135693 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:47:59.135673 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df\": container with ID starting with 2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df not found: ID does not exist" containerID="2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df" Apr 16 18:47:59.135755 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.135702 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df"} err="failed to get container status \"2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df\": rpc error: code = NotFound desc = could not find container \"2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df\": container with ID starting with 2f07a4c0844145b0de8128e37804c905a6e5c0f66702f97e830f89a0d8b5f2df not found: ID does not exist" Apr 16 18:47:59.135755 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.135720 2566 scope.go:117] "RemoveContainer" containerID="f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7" Apr 16 18:47:59.135941 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:47:59.135928 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7\": container with ID starting with f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7 not found: ID does not exist" containerID="f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7" Apr 16 18:47:59.136007 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.135944 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7"} err="failed to get container status \"f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7\": rpc error: code = NotFound desc = could not 
find container \"f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7\": container with ID starting with f1061c2830a914ef2204262b0d915d3762868463eb28c76fa1f19563c40c70e7 not found: ID does not exist" Apr 16 18:47:59.136155 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.136138 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9240f277-c91a-434b-a1b3-1786a641bc8a-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:47:59.139623 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.139604 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"] Apr 16 18:47:59.146507 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:47:59.146486 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-predictor-5584ffd8c9-99nbf"] Apr 16 18:48:00.123035 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:00.122984 2566 generic.go:358] "Generic (PLEG): container finished" podID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerID="1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20" exitCode=0 Apr 16 18:48:00.123412 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:00.123061 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" event={"ID":"bf1c71ad-f082-47af-ac96-c09b63bf071f","Type":"ContainerDied","Data":"1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20"} Apr 16 18:48:00.451744 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:00.451664 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" path="/var/lib/kubelet/pods/9240f277-c91a-434b-a1b3-1786a641bc8a/volumes" Apr 16 18:48:01.128257 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:01.128219 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" event={"ID":"bf1c71ad-f082-47af-ac96-c09b63bf071f","Type":"ContainerStarted","Data":"be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716"} Apr 16 18:48:01.128608 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:01.128504 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:48:01.129747 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:01.129723 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:48:01.148511 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:01.148468 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podStartSLOduration=6.14845509 podStartE2EDuration="6.14845509s" podCreationTimestamp="2026-04-16 18:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:48:01.147115369 +0000 UTC m=+1857.239403359" watchObservedRunningTime="2026-04-16 18:48:01.14845509 +0000 UTC m=+1857.240743056" Apr 16 18:48:02.131946 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:02.131910 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:48:12.132866 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:12.132823 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:48:22.132055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:22.131980 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:48:32.132757 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:32.132670 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:48:42.132617 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:42.132573 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:48:52.132487 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:48:52.132438 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:49:02.132885 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:02.132840 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:49:12.132586 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:12.132539 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:49:16.448349 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:16.448308 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.38:8080: connect: connection refused" Apr 16 18:49:26.452346 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:26.452319 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:49:36.651033 ip-10-0-136-226 kubenswrapper[2566]: I0416 
18:49:36.650983 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5"] Apr 16 18:49:36.651517 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.651273 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" containerID="cri-o://be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716" gracePeriod=30 Apr 16 18:49:36.757950 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.757868 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf"] Apr 16 18:49:36.758236 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.758223 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="storage-initializer" Apr 16 18:49:36.758279 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.758238 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="storage-initializer" Apr 16 18:49:36.758279 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.758258 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" Apr 16 18:49:36.758279 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.758263 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" Apr 16 18:49:36.758371 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.758344 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="9240f277-c91a-434b-a1b3-1786a641bc8a" containerName="kserve-container" Apr 16 18:49:36.761295 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.761273 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:49:36.773520 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.773493 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf"] Apr 16 18:49:36.872359 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.872321 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf\" (UID: \"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:49:36.973223 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.973188 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf\" (UID: \"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:49:36.973554 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:36.973534 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9-kserve-provision-location\") pod \"isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf\" (UID: \"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9\") " pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:49:37.071362 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:37.071241 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:49:37.191015 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:37.190838 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf"] Apr 16 18:49:37.193219 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:49:37.193189 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5569bdf2_4286_4ae9_8ffe_5f4fafbaaee9.slice/crio-d4db7c89ce6900757953c5d72c6ad4ed3fc4cecf1ad73b3c018c51ab9e80cc7e WatchSource:0}: Error finding container d4db7c89ce6900757953c5d72c6ad4ed3fc4cecf1ad73b3c018c51ab9e80cc7e: Status 404 returned error can't find the container with id d4db7c89ce6900757953c5d72c6ad4ed3fc4cecf1ad73b3c018c51ab9e80cc7e Apr 16 18:49:37.411393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:37.411356 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" event={"ID":"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9","Type":"ContainerStarted","Data":"0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca"} Apr 16 18:49:37.411393 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:37.411396 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" event={"ID":"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9","Type":"ContainerStarted","Data":"d4db7c89ce6900757953c5d72c6ad4ed3fc4cecf1ad73b3c018c51ab9e80cc7e"} Apr 16 18:49:40.198385 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.198359 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:49:40.302649 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.302568 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf1c71ad-f082-47af-ac96-c09b63bf071f-kserve-provision-location\") pod \"bf1c71ad-f082-47af-ac96-c09b63bf071f\" (UID: \"bf1c71ad-f082-47af-ac96-c09b63bf071f\") " Apr 16 18:49:40.302905 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.302884 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf1c71ad-f082-47af-ac96-c09b63bf071f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bf1c71ad-f082-47af-ac96-c09b63bf071f" (UID: "bf1c71ad-f082-47af-ac96-c09b63bf071f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:49:40.404016 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.403958 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bf1c71ad-f082-47af-ac96-c09b63bf071f-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:49:40.421621 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.421588 2566 generic.go:358] "Generic (PLEG): container finished" podID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerID="be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716" exitCode=0 Apr 16 18:49:40.421779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.421648 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" Apr 16 18:49:40.421779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.421674 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" event={"ID":"bf1c71ad-f082-47af-ac96-c09b63bf071f","Type":"ContainerDied","Data":"be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716"} Apr 16 18:49:40.421779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.421724 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5" event={"ID":"bf1c71ad-f082-47af-ac96-c09b63bf071f","Type":"ContainerDied","Data":"3c4208c7ebfb3ed12cc755e12bd39731f78e68654dbe0c4911d97e6b0ff2cd8c"} Apr 16 18:49:40.421779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.421745 2566 scope.go:117] "RemoveContainer" containerID="be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716" Apr 16 18:49:40.430025 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.429979 2566 scope.go:117] "RemoveContainer" containerID="1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20" Apr 16 18:49:40.436959 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.436943 2566 scope.go:117] "RemoveContainer" containerID="be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716" Apr 16 18:49:40.437255 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:49:40.437235 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716\": container with ID starting with be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716 not found: ID does not exist" containerID="be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716" Apr 16 18:49:40.437311 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.437267 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716"} err="failed to get container status \"be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716\": rpc error: code = NotFound desc = could not find container \"be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716\": container with ID starting with be9678b422c831cfd4c167e0a5c273ed460a2999cb8a1878c698ebbbea722716 not found: ID does not exist" Apr 16 18:49:40.437311 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.437287 2566 scope.go:117] "RemoveContainer" containerID="1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20" Apr 16 18:49:40.437535 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:49:40.437516 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20\": container with ID starting with 1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20 not found: ID does not exist" containerID="1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20" Apr 16 18:49:40.437575 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.437541 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20"} err="failed to get container status \"1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20\": rpc error: code = 
NotFound desc = could not find container \"1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20\": container with ID starting with 1c79f4dd2f69e1103eef1b4e2a5e293e8cb661da1e8beb53273c4785cd612f20 not found: ID does not exist" Apr 16 18:49:40.444252 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.444227 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5"] Apr 16 18:49:40.446890 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.446870 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-runtime-predictor-7576f6b69f-wmsz5"] Apr 16 18:49:40.452444 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:40.452424 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" path="/var/lib/kubelet/pods/bf1c71ad-f082-47af-ac96-c09b63bf071f/volumes" Apr 16 18:49:41.427087 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:41.427054 2566 generic.go:358] "Generic (PLEG): container finished" podID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerID="0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca" exitCode=0 Apr 16 18:49:41.427485 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:41.427132 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" event={"ID":"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9","Type":"ContainerDied","Data":"0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca"} Apr 16 18:49:42.437977 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:42.437939 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" event={"ID":"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9","Type":"ContainerStarted","Data":"48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8"} Apr 16 18:49:42.438456 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:42.438271 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:49:42.439568 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:42.439545 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:49:42.455428 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:42.455364 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podStartSLOduration=6.455349756 podStartE2EDuration="6.455349756s" podCreationTimestamp="2026-04-16 18:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:49:42.454730688 +0000 UTC m=+1958.547018654" watchObservedRunningTime="2026-04-16 18:49:42.455349756 +0000 UTC m=+1958.547637726" Apr 16 18:49:43.441285 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:43.441245 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:49:53.441444 
ip-10-0-136-226 kubenswrapper[2566]: I0416 18:49:53.441359 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:50:03.441747 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:50:03.441701 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:50:13.441751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:50:13.441709 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:50:23.441931 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:50:23.441891 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:50:33.442262 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:50:33.442216 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:50:43.441741 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:50:43.441697 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:50:46.448597 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:50:46.448557 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:50:56.449140 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:50:56.449093 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.39:8080: connect: connection refused" Apr 16 18:51:06.451830 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:06.451803 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:51:07.949624 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:07.949585 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf"] Apr 16 18:51:07.950185 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:07.949941 2566 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" containerID="cri-o://48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8" gracePeriod=30 Apr 16 18:51:08.051056 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.051022 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"] Apr 16 18:51:08.051395 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.051383 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="storage-initializer" Apr 16 18:51:08.051440 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.051396 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="storage-initializer" Apr 16 18:51:08.051440 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.051410 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" Apr 16 18:51:08.051440 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.051417 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" Apr 16 18:51:08.051537 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.051465 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bf1c71ad-f082-47af-ac96-c09b63bf071f" containerName="kserve-container" Apr 16 18:51:08.054406 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.054386 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" Apr 16 18:51:08.062579 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.062553 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"] Apr 16 18:51:08.134944 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.134910 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3130fe51-9bbc-4859-8e5d-6d33d40f4e57-kserve-provision-location\") pod \"isvc-primary-0fc82b-predictor-77487ffb79-drqf6\" (UID: \"3130fe51-9bbc-4859-8e5d-6d33d40f4e57\") " pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" Apr 16 18:51:08.235444 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.235365 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3130fe51-9bbc-4859-8e5d-6d33d40f4e57-kserve-provision-location\") pod \"isvc-primary-0fc82b-predictor-77487ffb79-drqf6\" (UID: \"3130fe51-9bbc-4859-8e5d-6d33d40f4e57\") " pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" Apr 16 18:51:08.235758 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.235739 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3130fe51-9bbc-4859-8e5d-6d33d40f4e57-kserve-provision-location\") pod \"isvc-primary-0fc82b-predictor-77487ffb79-drqf6\" (UID: \"3130fe51-9bbc-4859-8e5d-6d33d40f4e57\") " pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" Apr 16 18:51:08.365477 ip-10-0-136-226 
kubenswrapper[2566]: I0416 18:51:08.365449 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" Apr 16 18:51:08.484123 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.484094 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"] Apr 16 18:51:08.487071 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:51:08.486988 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3130fe51_9bbc_4859_8e5d_6d33d40f4e57.slice/crio-e9501c4e307676150ed73ae97e402d8d6b060ffd07ee4940b818251813dc9a4b WatchSource:0}: Error finding container e9501c4e307676150ed73ae97e402d8d6b060ffd07ee4940b818251813dc9a4b: Status 404 returned error can't find the container with id e9501c4e307676150ed73ae97e402d8d6b060ffd07ee4940b818251813dc9a4b Apr 16 18:51:08.689605 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.689570 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" event={"ID":"3130fe51-9bbc-4859-8e5d-6d33d40f4e57","Type":"ContainerStarted","Data":"cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b"} Apr 16 18:51:08.689605 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:08.689607 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" event={"ID":"3130fe51-9bbc-4859-8e5d-6d33d40f4e57","Type":"ContainerStarted","Data":"e9501c4e307676150ed73ae97e402d8d6b060ffd07ee4940b818251813dc9a4b"} Apr 16 18:51:11.490742 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.490719 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:51:11.565597 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.565526 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9-kserve-provision-location\") pod \"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9\" (UID: \"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9\") " Apr 16 18:51:11.565812 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.565789 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" (UID: "5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:51:11.666582 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.666553 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:51:11.700047 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.700008 2566 generic.go:358] "Generic (PLEG): container finished" podID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerID="48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8" exitCode=0 Apr 16 18:51:11.700193 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.700072 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" Apr 16 18:51:11.700193 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.700083 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" event={"ID":"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9","Type":"ContainerDied","Data":"48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8"} Apr 16 18:51:11.700193 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.700127 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf" event={"ID":"5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9","Type":"ContainerDied","Data":"d4db7c89ce6900757953c5d72c6ad4ed3fc4cecf1ad73b3c018c51ab9e80cc7e"} Apr 16 18:51:11.700193 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.700147 2566 scope.go:117] "RemoveContainer" containerID="48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8" Apr 16 18:51:11.708712 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.708695 2566 scope.go:117] "RemoveContainer" containerID="0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca" Apr 16 18:51:11.715850 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.715834 2566 scope.go:117] "RemoveContainer" containerID="48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8" Apr 16 18:51:11.716185 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:51:11.716163 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8\": container with ID starting with 48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8 not found: ID does not exist" containerID="48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8" Apr 16 18:51:11.716259 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.716193 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8"} err="failed to get container status \"48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8\": rpc error: code = NotFound desc = could not find container \"48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8\": container with ID starting with 48c3fea882846ff13b18a3379db7d23f03fa9fd4f8c50906891541912dec48a8 not found: ID does not exist" Apr 16 18:51:11.716259 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.716213 2566 scope.go:117] "RemoveContainer" containerID="0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca" Apr 16 18:51:11.716496 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:51:11.716478 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca\": container with ID starting with 0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca not found: ID does not exist" containerID="0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca" Apr 16 18:51:11.716546 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.716502 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca"} err="failed to get container status \"0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca\": rpc error: 
code = NotFound desc = could not find container \"0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca\": container with ID starting with 0963dc9bcc7c2b381ad46d529ce07b2dff0b333b60ee6bd24b9f3e5492ad6cca not found: ID does not exist" Apr 16 18:51:11.723636 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.723616 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf"] Apr 16 18:51:11.728680 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:11.728658 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-pmml-v2-kserve-predictor-75b87ff64c-g4nkf"] Apr 16 18:51:12.452284 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:12.452255 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" path="/var/lib/kubelet/pods/5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9/volumes" Apr 16 18:51:12.705580 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:12.705546 2566 generic.go:358] "Generic (PLEG): container finished" podID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerID="cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b" exitCode=0 Apr 16 18:51:12.706170 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:12.705611 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" event={"ID":"3130fe51-9bbc-4859-8e5d-6d33d40f4e57","Type":"ContainerDied","Data":"cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b"} Apr 16 18:51:13.713273 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:13.713238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" event={"ID":"3130fe51-9bbc-4859-8e5d-6d33d40f4e57","Type":"ContainerStarted","Data":"df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8"} Apr 16 18:51:13.713741 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:13.713524 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" Apr 16 18:51:13.714863 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:13.714808 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:51:13.733138 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:13.733095 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podStartSLOduration=5.733076945 podStartE2EDuration="5.733076945s" podCreationTimestamp="2026-04-16 18:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:51:13.730976903 +0000 UTC m=+2049.823264868" watchObservedRunningTime="2026-04-16 18:51:13.733076945 +0000 UTC m=+2049.825364909" Apr 16 18:51:14.716045 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:14.716009 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:51:24.716907 
ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:24.716816 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:51:34.716945 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:34.716897 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:51:44.716837 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:44.716798 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:51:54.716360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:51:54.716315 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:52:04.716566 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:04.716521 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:52:14.716819 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:14.716777 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.40:8080: connect: connection refused" Apr 16 18:52:24.717055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:24.717018 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" Apr 16 18:52:28.151432 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.151378 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"] Apr 16 18:52:28.151781 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.151747 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" Apr 16 18:52:28.151781 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.151759 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" Apr 16 18:52:28.151852 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.151793 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="storage-initializer" Apr 16 18:52:28.151852 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.151800 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" 
containerName="storage-initializer" Apr 16 18:52:28.151914 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.151853 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="5569bdf2-4286-4ae9-8ffe-5f4fafbaaee9" containerName="kserve-container" Apr 16 18:52:28.154732 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.154715 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.157096 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.157072 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-0fc82b\"" Apr 16 18:52:28.157223 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.157184 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 18:52:28.157286 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.157274 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-0fc82b-dockercfg-dmwvq\"" Apr 16 18:52:28.163274 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.163251 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"] Apr 16 18:52:28.305239 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.305194 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-cabundle-cert\") pod \"isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") " pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.305413 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.305267 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-kserve-provision-location\") pod \"isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") " pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.406577 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.406485 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-cabundle-cert\") pod \"isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") " pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.406577 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.406553 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-kserve-provision-location\") pod \"isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") " pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.406920 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.406901 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-kserve-provision-location\") pod \"isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") " pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.407114 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.407095 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-cabundle-cert\") pod \"isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") " pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.465938 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.465898 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" Apr 16 18:52:28.581707 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.581676 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"] Apr 16 18:52:28.584597 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:52:28.584568 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc94af2a8_462f_48fb_a1fb_8d01e1ef12b7.slice/crio-2546c9f52fac6f239bd4bc9f596e50b11c41cfadf01cfb24a307d0c4d56ac91c WatchSource:0}: Error finding container 2546c9f52fac6f239bd4bc9f596e50b11c41cfadf01cfb24a307d0c4d56ac91c: Status 404 returned error can't find the container with id 2546c9f52fac6f239bd4bc9f596e50b11c41cfadf01cfb24a307d0c4d56ac91c Apr 16 18:52:28.586527 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.586500 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 18:52:28.932808 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.932773 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" event={"ID":"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7","Type":"ContainerStarted","Data":"38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699"} Apr 16 18:52:28.932808 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:28.932815 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" event={"ID":"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7","Type":"ContainerStarted","Data":"2546c9f52fac6f239bd4bc9f596e50b11c41cfadf01cfb24a307d0c4d56ac91c"} Apr 16 18:52:34.954529 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:34.954503 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/storage-initializer/0.log" Apr 16 18:52:34.954913 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:34.954543 2566 generic.go:358] "Generic (PLEG): container finished" podID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerID="38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699" exitCode=1 Apr 16 18:52:34.954913 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:34.954576 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" event={"ID":"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7","Type":"ContainerDied","Data":"38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699"} Apr 16 
18:52:35.959481 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:35.959452 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/storage-initializer/0.log" Apr 16 18:52:35.959865 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:35.959537 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" event={"ID":"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7","Type":"ContainerStarted","Data":"7ae9d44ee051a7f8026c7c7e0680c8239c6ad844e47d9fedfa6ab64c43827d42"} Apr 16 18:52:40.976934 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:40.976906 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/storage-initializer/1.log" Apr 16 18:52:40.977348 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:40.977255 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/storage-initializer/0.log" Apr 16 18:52:40.977348 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:40.977285 2566 generic.go:358] "Generic (PLEG): container finished" podID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerID="7ae9d44ee051a7f8026c7c7e0680c8239c6ad844e47d9fedfa6ab64c43827d42" exitCode=1 Apr 16 18:52:40.977348 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:40.977312 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" event={"ID":"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7","Type":"ContainerDied","Data":"7ae9d44ee051a7f8026c7c7e0680c8239c6ad844e47d9fedfa6ab64c43827d42"} Apr 16 18:52:40.977348 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:40.977338 2566 scope.go:117] "RemoveContainer" containerID="38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699" Apr 16 18:52:40.977751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:40.977728 2566 scope.go:117] "RemoveContainer" containerID="38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699" Apr 16 18:52:40.987855 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:52:40.987828 2566 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_kserve-ci-e2e-test_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7_0 in pod sandbox 2546c9f52fac6f239bd4bc9f596e50b11c41cfadf01cfb24a307d0c4d56ac91c from index: no such id: '38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699'" containerID="38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699" Apr 16 18:52:40.987920 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:52:40.987874 2566 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_kserve-ci-e2e-test_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7_0 in pod sandbox 2546c9f52fac6f239bd4bc9f596e50b11c41cfadf01cfb24a307d0c4d56ac91c from index: no such id: '38ee89364f4bfe6e6ee4d821e36fe728d9118119ea9a81164f79ea8ce8328699'; Skipping pod \"isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_kserve-ci-e2e-test(c94af2a8-462f-48fb-a1fb-8d01e1ef12b7)\"" logger="UnhandledError" Apr 16 18:52:40.989216 
ip-10-0-136-226 kubenswrapper[2566]: E0416 18:52:40.989196 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_kserve-ci-e2e-test(c94af2a8-462f-48fb-a1fb-8d01e1ef12b7)\"" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7"
Apr 16 18:52:41.981983 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:41.981954 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/storage-initializer/1.log"
Apr 16 18:52:46.216228 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.216197 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"]
Apr 16 18:52:46.216634 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.216475 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container" containerID="cri-o://df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8" gracePeriod=30
Apr 16 18:52:46.269191 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.269160 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"]
Apr 16 18:52:46.347185 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.347158 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"]
Apr 16 18:52:46.351848 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.351827 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.354190 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.354166 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-08836d\""
Apr 16 18:52:46.354280 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.354171 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-08836d-dockercfg-kmmf2\""
Apr 16 18:52:46.359351 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.359326 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"]
Apr 16 18:52:46.399289 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.399271 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/storage-initializer/1.log"
Apr 16 18:52:46.399381 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.399329 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"
Apr 16 18:52:46.450167 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450123 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-kserve-provision-location\") pod \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") "
Apr 16 18:52:46.450351 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450195 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-cabundle-cert\") pod \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\" (UID: \"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7\") "
Apr 16 18:52:46.450415 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450397 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" (UID: "c94af2a8-462f-48fb-a1fb-8d01e1ef12b7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:52:46.450462 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450430 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-cabundle-cert\") pod \"isvc-init-fail-08836d-predictor-745594c9d4-cxshn\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") " pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.450556 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450536 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" (UID: "c94af2a8-462f-48fb-a1fb-8d01e1ef12b7"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:52:46.450596 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450543 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-kserve-provision-location\") pod \"isvc-init-fail-08836d-predictor-745594c9d4-cxshn\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") " pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.450644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450616 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:52:46.450644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.450629 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7-cabundle-cert\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:52:46.551506 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.551421 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-kserve-provision-location\") pod \"isvc-init-fail-08836d-predictor-745594c9d4-cxshn\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") " pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.551506 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.551504 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-cabundle-cert\") pod \"isvc-init-fail-08836d-predictor-745594c9d4-cxshn\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") " pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.551820 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.551802 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-kserve-provision-location\") pod \"isvc-init-fail-08836d-predictor-745594c9d4-cxshn\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") " pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.552106 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.552086 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-cabundle-cert\") pod \"isvc-init-fail-08836d-predictor-745594c9d4-cxshn\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") " pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.663664 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.663620 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:46.788027 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.787984 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"]
Apr 16 18:52:46.791393 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:52:46.791360 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac0b0cd_03cb_43de_a9e1_4eeb0d339707.slice/crio-18c4969dffb0ae15a7c8930365349a97274d5bbcaaea65fdae814c981a3f9dfc WatchSource:0}: Error finding container 18c4969dffb0ae15a7c8930365349a97274d5bbcaaea65fdae814c981a3f9dfc: Status 404 returned error can't find the container with id 18c4969dffb0ae15a7c8930365349a97274d5bbcaaea65fdae814c981a3f9dfc
Apr 16 18:52:46.999189 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.999159 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc_c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/storage-initializer/1.log"
Apr 16 18:52:46.999360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.999278 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"
Apr 16 18:52:46.999360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.999308 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc" event={"ID":"c94af2a8-462f-48fb-a1fb-8d01e1ef12b7","Type":"ContainerDied","Data":"2546c9f52fac6f239bd4bc9f596e50b11c41cfadf01cfb24a307d0c4d56ac91c"}
Apr 16 18:52:46.999360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:46.999354 2566 scope.go:117] "RemoveContainer" containerID="7ae9d44ee051a7f8026c7c7e0680c8239c6ad844e47d9fedfa6ab64c43827d42"
Apr 16 18:52:47.000840 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:47.000816 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn" event={"ID":"fac0b0cd-03cb-43de-a9e1-4eeb0d339707","Type":"ContainerStarted","Data":"13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79"}
Apr 16 18:52:47.000950 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:47.000848 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn" event={"ID":"fac0b0cd-03cb-43de-a9e1-4eeb0d339707","Type":"ContainerStarted","Data":"18c4969dffb0ae15a7c8930365349a97274d5bbcaaea65fdae814c981a3f9dfc"}
Apr 16 18:52:47.046251 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:47.046224 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"]
Apr 16 18:52:47.054765 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:47.054737 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-0fc82b-predictor-8679bff96b-z7vqc"]
Apr 16 18:52:48.451697 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:48.451663 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" path="/var/lib/kubelet/pods/c94af2a8-462f-48fb-a1fb-8d01e1ef12b7/volumes"
Apr 16 18:52:50.013775 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:50.013745 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-08836d-predictor-745594c9d4-cxshn_fac0b0cd-03cb-43de-a9e1-4eeb0d339707/storage-initializer/0.log"
Apr 16 18:52:50.014210 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:50.013788 2566 generic.go:358] "Generic (PLEG): container finished" podID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerID="13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79" exitCode=1
Apr 16 18:52:50.014210 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:50.013864 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn" event={"ID":"fac0b0cd-03cb-43de-a9e1-4eeb0d339707","Type":"ContainerDied","Data":"13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79"}
Apr 16 18:52:50.466365 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:50.466338 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"
Apr 16 18:52:50.585175 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:50.585072 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3130fe51-9bbc-4859-8e5d-6d33d40f4e57-kserve-provision-location\") pod \"3130fe51-9bbc-4859-8e5d-6d33d40f4e57\" (UID: \"3130fe51-9bbc-4859-8e5d-6d33d40f4e57\") "
Apr 16 18:52:50.585360 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:50.585340 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3130fe51-9bbc-4859-8e5d-6d33d40f4e57-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3130fe51-9bbc-4859-8e5d-6d33d40f4e57" (UID: "3130fe51-9bbc-4859-8e5d-6d33d40f4e57"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:52:50.685724 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:50.685678 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3130fe51-9bbc-4859-8e5d-6d33d40f4e57-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:52:51.019447 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.019414 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-08836d-predictor-745594c9d4-cxshn_fac0b0cd-03cb-43de-a9e1-4eeb0d339707/storage-initializer/0.log"
Apr 16 18:52:51.019893 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.019530 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn" event={"ID":"fac0b0cd-03cb-43de-a9e1-4eeb0d339707","Type":"ContainerStarted","Data":"c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206"}
Apr 16 18:52:51.020914 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.020893 2566 generic.go:358] "Generic (PLEG): container finished" podID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerID="df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8" exitCode=0
Apr 16 18:52:51.021064 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.020926 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" event={"ID":"3130fe51-9bbc-4859-8e5d-6d33d40f4e57","Type":"ContainerDied","Data":"df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8"}
Apr 16 18:52:51.021064 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.020946 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6" event={"ID":"3130fe51-9bbc-4859-8e5d-6d33d40f4e57","Type":"ContainerDied","Data":"e9501c4e307676150ed73ae97e402d8d6b060ffd07ee4940b818251813dc9a4b"}
Apr 16 18:52:51.021064 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.020949 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"
Apr 16 18:52:51.021064 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.020960 2566 scope.go:117] "RemoveContainer" containerID="df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8"
Apr 16 18:52:51.028897 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.028882 2566 scope.go:117] "RemoveContainer" containerID="cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b"
Apr 16 18:52:51.035628 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.035613 2566 scope.go:117] "RemoveContainer" containerID="df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8"
Apr 16 18:52:51.035863 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:52:51.035845 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8\": container with ID starting with df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8 not found: ID does not exist" containerID="df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8"
Apr 16 18:52:51.035923 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.035870 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8"} err="failed to get container status \"df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8\": rpc error: code = NotFound desc = could not find container \"df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8\": container with ID starting with df8e9f56f6b875290256f367410df8b2e4ea8c38061a8295ab750528fdbc36e8 not found: ID does not exist"
Apr 16 18:52:51.035923 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.035888 2566 scope.go:117] "RemoveContainer" containerID="cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b"
Apr 16 18:52:51.038806 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:52:51.036335 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b\": container with ID starting with cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b not found: ID does not exist" containerID="cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b"
Apr 16 18:52:51.038806 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.036369 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b"} err="failed to get container status \"cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b\": rpc error: code = NotFound desc = could not find container \"cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b\": container with ID starting with cd1208a0ab79c6edee5684231e5f90736cc652041ef9327947d81d1b79db918b not found: ID does not exist"
Apr 16 18:52:51.052957 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.052936 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"]
Apr 16 18:52:51.057763 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.057743 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-0fc82b-predictor-77487ffb79-drqf6"]
Apr 16 18:52:51.393547 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.393515 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"]
Apr 16 18:52:51.514855 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.514821 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"]
Apr 16 18:52:51.515195 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515182 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerName="storage-initializer"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515198 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerName="storage-initializer"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515212 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515217 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515226 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="storage-initializer"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515235 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="storage-initializer"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515249 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerName="storage-initializer"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515255 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerName="storage-initializer"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515306 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerName="storage-initializer"
Apr 16 18:52:51.515372 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515316 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" containerName="kserve-container"
Apr 16 18:52:51.515636 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.515420 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c94af2a8-462f-48fb-a1fb-8d01e1ef12b7" containerName="storage-initializer"
Apr 16 18:52:51.519684 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.519665 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:52:51.521978 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.521961 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tdsxq\""
Apr 16 18:52:51.528369 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.528350 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"]
Apr 16 18:52:51.593956 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.593932 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9d10793-cc4b-41df-8478-5884f3f072d3-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-8gzps\" (UID: \"c9d10793-cc4b-41df-8478-5884f3f072d3\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:52:51.694644 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.694559 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9d10793-cc4b-41df-8478-5884f3f072d3-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-8gzps\" (UID: \"c9d10793-cc4b-41df-8478-5884f3f072d3\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:52:51.694892 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.694877 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9d10793-cc4b-41df-8478-5884f3f072d3-kserve-provision-location\") pod \"isvc-predictive-sklearn-predictor-85bccb8945-8gzps\" (UID: \"c9d10793-cc4b-41df-8478-5884f3f072d3\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:52:51.829751 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.829723 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:52:51.955086 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:51.955063 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"]
Apr 16 18:52:51.957213 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:52:51.957185 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9d10793_cc4b_41df_8478_5884f3f072d3.slice/crio-bc49f4274b767d348b231d22c8fc944790929277c378288540dc0aad9eb1ad4c WatchSource:0}: Error finding container bc49f4274b767d348b231d22c8fc944790929277c378288540dc0aad9eb1ad4c: Status 404 returned error can't find the container with id bc49f4274b767d348b231d22c8fc944790929277c378288540dc0aad9eb1ad4c
Apr 16 18:52:52.026232 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:52.026196 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" event={"ID":"c9d10793-cc4b-41df-8478-5884f3f072d3","Type":"ContainerStarted","Data":"bc49f4274b767d348b231d22c8fc944790929277c378288540dc0aad9eb1ad4c"}
Apr 16 18:52:52.027236 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:52.027202 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerName="storage-initializer" containerID="cri-o://c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206" gracePeriod=30
Apr 16 18:52:52.452859 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:52.452825 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3130fe51-9bbc-4859-8e5d-6d33d40f4e57" path="/var/lib/kubelet/pods/3130fe51-9bbc-4859-8e5d-6d33d40f4e57/volumes"
Apr 16 18:52:53.032848 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:53.032810 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" event={"ID":"c9d10793-cc4b-41df-8478-5884f3f072d3","Type":"ContainerStarted","Data":"e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168"}
Apr 16 18:52:54.770229 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.770208 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-08836d-predictor-745594c9d4-cxshn_fac0b0cd-03cb-43de-a9e1-4eeb0d339707/storage-initializer/1.log"
Apr 16 18:52:54.770586 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.770546 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-08836d-predictor-745594c9d4-cxshn_fac0b0cd-03cb-43de-a9e1-4eeb0d339707/storage-initializer/0.log"
Apr 16 18:52:54.770647 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.770622 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:54.821185 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.821161 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-cabundle-cert\") pod \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") "
Apr 16 18:52:54.821346 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.821216 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-kserve-provision-location\") pod \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\" (UID: \"fac0b0cd-03cb-43de-a9e1-4eeb0d339707\") "
Apr 16 18:52:54.821501 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.821475 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fac0b0cd-03cb-43de-a9e1-4eeb0d339707" (UID: "fac0b0cd-03cb-43de-a9e1-4eeb0d339707"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:52:54.821550 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.821534 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "fac0b0cd-03cb-43de-a9e1-4eeb0d339707" (UID: "fac0b0cd-03cb-43de-a9e1-4eeb0d339707"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 18:52:54.922068 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.921987 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-cabundle-cert\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:52:54.922068 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:54.922030 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fac0b0cd-03cb-43de-a9e1-4eeb0d339707-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:52:55.039832 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.039804 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-08836d-predictor-745594c9d4-cxshn_fac0b0cd-03cb-43de-a9e1-4eeb0d339707/storage-initializer/1.log"
Apr 16 18:52:55.040157 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.040140 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-08836d-predictor-745594c9d4-cxshn_fac0b0cd-03cb-43de-a9e1-4eeb0d339707/storage-initializer/0.log"
Apr 16 18:52:55.040217 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.040176 2566 generic.go:358] "Generic (PLEG): container finished" podID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerID="c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206" exitCode=1
Apr 16 18:52:55.040254 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.040210 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn" event={"ID":"fac0b0cd-03cb-43de-a9e1-4eeb0d339707","Type":"ContainerDied","Data":"c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206"}
Apr 16 18:52:55.040254 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.040246 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"
Apr 16 18:52:55.040323 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.040259 2566 scope.go:117] "RemoveContainer" containerID="c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206"
Apr 16 18:52:55.040368 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.040248 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn" event={"ID":"fac0b0cd-03cb-43de-a9e1-4eeb0d339707","Type":"ContainerDied","Data":"18c4969dffb0ae15a7c8930365349a97274d5bbcaaea65fdae814c981a3f9dfc"}
Apr 16 18:52:55.049506 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.049493 2566 scope.go:117] "RemoveContainer" containerID="13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79"
Apr 16 18:52:55.058823 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.058803 2566 scope.go:117] "RemoveContainer" containerID="c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206"
Apr 16 18:52:55.059191 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:52:55.059163 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206\": container with ID starting with c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206 not found: ID does not exist" containerID="c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206"
Apr 16 18:52:55.059291 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.059196 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206"} err="failed to get container status \"c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206\": rpc error: code = NotFound desc = could not find container \"c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206\": container with ID starting with c91b09e94c5b97e629cf9f74562653527b43845853636d3ebe51f0499517d206 not found: ID does not exist"
Apr 16 18:52:55.059291 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.059220 2566 scope.go:117] "RemoveContainer" containerID="13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79"
Apr 16 18:52:55.059486 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:52:55.059473 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79\": container with ID starting with 13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79 not found: ID does not exist" containerID="13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79"
Apr 16 18:52:55.059534 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.059489 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79"} err="failed to get container status \"13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79\": rpc error: code = NotFound desc = could not find container \"13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79\": container with ID starting with 13a289c4c365dfd44d5b6b827dc194fb4d44238bf63f2fc2e4cbd0529f869c79 not found: ID does not exist"
Apr 16 18:52:55.082450 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.082426 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"]
Apr 16 18:52:55.088141 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:55.088120 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-08836d-predictor-745594c9d4-cxshn"]
Apr 16 18:52:56.045385 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:56.045355 2566 generic.go:358] "Generic (PLEG): container finished" podID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerID="e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168" exitCode=0
Apr 16 18:52:56.045845 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:56.045440 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" event={"ID":"c9d10793-cc4b-41df-8478-5884f3f072d3","Type":"ContainerDied","Data":"e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168"}
Apr 16 18:52:56.451730 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:52:56.451696 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" path="/var/lib/kubelet/pods/fac0b0cd-03cb-43de-a9e1-4eeb0d339707/volumes"
Apr 16 18:53:18.119914 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:18.119881 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" event={"ID":"c9d10793-cc4b-41df-8478-5884f3f072d3","Type":"ContainerStarted","Data":"f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f"}
Apr 16 18:53:18.120256 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:18.120187 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:53:18.121422 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:18.121396 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:53:18.142713 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:18.142672 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podStartSLOduration=5.141099628 podStartE2EDuration="27.142659852s" podCreationTimestamp="2026-04-16 18:52:51 +0000 UTC" firstStartedPulling="2026-04-16 18:52:56.046666878 +0000 UTC m=+2152.138954821" lastFinishedPulling="2026-04-16 18:53:18.048227101 +0000 UTC m=+2174.140515045" observedRunningTime="2026-04-16 18:53:18.140615135 +0000 UTC m=+2174.232903122" watchObservedRunningTime="2026-04-16 18:53:18.142659852 +0000 UTC m=+2174.234947817"
Apr 16 18:53:19.123193 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:19.123152 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:53:29.124028 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:29.123974 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:53:39.124055 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:39.123987 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:53:49.124072 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:49.124026 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:53:59.123948 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:53:59.123902 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:54:09.123298 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:09.123257 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:54:19.124015 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:19.123960 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:54:29.123589 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:29.123501 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.43:8080: connect: connection refused"
Apr 16 18:54:39.123778 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:39.123747 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:54:41.919578 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:41.919546 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"]
Apr 16 18:54:41.920063 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:41.919869 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container" containerID="cri-o://f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f" gracePeriod=30
Apr 16 18:54:42.184587 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.184515 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"]
Apr 16 18:54:42.184847 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.184836 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerName="storage-initializer"
Apr 16 18:54:42.184899 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.184849 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerName="storage-initializer"
Apr 16 18:54:42.184936 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.184918 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerName="storage-initializer"
Apr 16 18:54:42.184936 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.184929 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerName="storage-initializer"
Apr 16 18:54:42.185027 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.184973 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerName="storage-initializer"
Apr 16 18:54:42.185027 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.184980 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fac0b0cd-03cb-43de-a9e1-4eeb0d339707" containerName="storage-initializer"
Apr 16 18:54:42.187891 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.187871 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:54:42.214601 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.214572 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a462062d-0378-406b-a95c-1a63a7171482-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj\" (UID: \"a462062d-0378-406b-a95c-1a63a7171482\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:54:42.226054 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.223844 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"]
Apr 16 18:54:42.315378 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.315341 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a462062d-0378-406b-a95c-1a63a7171482-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj\" (UID: \"a462062d-0378-406b-a95c-1a63a7171482\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:54:42.315732 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.315710 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a462062d-0378-406b-a95c-1a63a7171482-kserve-provision-location\") pod \"isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj\" (UID: \"a462062d-0378-406b-a95c-1a63a7171482\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:54:42.498803 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.498715 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:54:42.634078 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:42.634053 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"]
Apr 16 18:54:42.635408 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:54:42.635379 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda462062d_0378_406b_a95c_1a63a7171482.slice/crio-8cc7af55c9dbabf9e64207b62266d2172d0e732d31335de62ac2d751b722221a WatchSource:0}: Error finding container 8cc7af55c9dbabf9e64207b62266d2172d0e732d31335de62ac2d751b722221a: Status 404 returned error can't find the container with id 8cc7af55c9dbabf9e64207b62266d2172d0e732d31335de62ac2d751b722221a
Apr 16 18:54:43.375280 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:43.375244 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" event={"ID":"a462062d-0378-406b-a95c-1a63a7171482","Type":"ContainerStarted","Data":"32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2"}
Apr 16 18:54:43.375280 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:43.375282 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" event={"ID":"a462062d-0378-406b-a95c-1a63a7171482","Type":"ContainerStarted","Data":"8cc7af55c9dbabf9e64207b62266d2172d0e732d31335de62ac2d751b722221a"}
Apr 16 18:54:46.585346 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:46.585324 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:54:46.651869 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:46.651790 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9d10793-cc4b-41df-8478-5884f3f072d3-kserve-provision-location\") pod \"c9d10793-cc4b-41df-8478-5884f3f072d3\" (UID: \"c9d10793-cc4b-41df-8478-5884f3f072d3\") "
Apr 16 18:54:46.652160 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:46.652130 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9d10793-cc4b-41df-8478-5884f3f072d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c9d10793-cc4b-41df-8478-5884f3f072d3" (UID: "c9d10793-cc4b-41df-8478-5884f3f072d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:54:46.753409 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:46.753376 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c9d10793-cc4b-41df-8478-5884f3f072d3-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:54:47.388887 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.388851 2566 generic.go:358] "Generic (PLEG): container finished" podID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerID="f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f" exitCode=0
Apr 16 18:54:47.389205 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.388918 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"
Apr 16 18:54:47.389205 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.388942 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" event={"ID":"c9d10793-cc4b-41df-8478-5884f3f072d3","Type":"ContainerDied","Data":"f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f"}
Apr 16 18:54:47.389205 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.388988 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps" event={"ID":"c9d10793-cc4b-41df-8478-5884f3f072d3","Type":"ContainerDied","Data":"bc49f4274b767d348b231d22c8fc944790929277c378288540dc0aad9eb1ad4c"}
Apr 16 18:54:47.389205 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.389026 2566 scope.go:117] "RemoveContainer" containerID="f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f"
Apr 16 18:54:47.390259 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.390241 2566 generic.go:358] "Generic (PLEG): container finished" podID="a462062d-0378-406b-a95c-1a63a7171482" containerID="32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2" exitCode=0
Apr 16 18:54:47.390341 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.390289 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" event={"ID":"a462062d-0378-406b-a95c-1a63a7171482","Type":"ContainerDied","Data":"32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2"}
Apr 16 18:54:47.403481 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.403453 2566 scope.go:117] "RemoveContainer" containerID="e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168"
Apr 16 18:54:47.410754 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.410732 2566 scope.go:117] "RemoveContainer" containerID="f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f"
Apr 16 18:54:47.411040 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:54:47.411019 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f\": container with ID starting with f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f not found: ID does not exist" containerID="f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f"
Apr 16 18:54:47.411128 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.411047 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f"} err="failed to get container status \"f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f\": rpc error: code = NotFound desc = could not find container \"f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f\": container with ID starting with f8df0fcca103aeb1bd6b361947d482616bd97cb32c2d9e043a83f4fadda5442f not found: ID does not exist"
Apr 16 18:54:47.411128 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.411063 2566 scope.go:117] "RemoveContainer" containerID="e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168"
Apr 16 18:54:47.411281 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:54:47.411262 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168\": container with ID starting with e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168 not found: ID does not exist" containerID="e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168"
Apr 16 18:54:47.411342 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.411285 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168"} err="failed to get container status \"e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168\": rpc error: code = NotFound desc = could not find container \"e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168\": container with ID starting with e3d61aef4474acb3a96f373c1afd68a37bf424242520b86883f20cd4f4f68168 not found: ID does not exist"
Apr 16 18:54:47.455392 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.455357 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"]
Apr 16 18:54:47.464540 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:47.464518 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-predictor-85bccb8945-8gzps"]
Apr 16 18:54:48.395559 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:48.395527 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" event={"ID":"a462062d-0378-406b-a95c-1a63a7171482","Type":"ContainerStarted","Data":"c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11"}
Apr 16 18:54:48.395949 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:48.395814 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:54:48.397130 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:48.397104 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:54:48.431076 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:48.431026 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podStartSLOduration=6.431011561 podStartE2EDuration="6.431011561s" podCreationTimestamp="2026-04-16 18:54:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:54:48.430208166 +0000 UTC m=+2264.522496130" watchObservedRunningTime="2026-04-16 18:54:48.431011561 +0000 UTC m=+2264.523299520"
Apr 16 18:54:48.452545 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:48.452514 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" path="/var/lib/kubelet/pods/c9d10793-cc4b-41df-8478-5884f3f072d3/volumes"
Apr 16 18:54:49.398679 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:49.398644 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:54:59.399252 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:54:59.399202 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:55:09.398781 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:55:09.398736 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:55:19.398902 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:55:19.398855 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:55:29.399277 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:55:29.399232 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:55:39.399070 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:55:39.399033 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:55:49.398791 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:55:49.398747 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:55:55.448975 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:55:55.448880 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:56:05.450128 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:05.450094 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:56:12.082271 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.082238 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"]
Apr 16 18:56:12.082672 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.082485 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" containerID="cri-o://c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11" gracePeriod=30
Apr 16 18:56:12.154108 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.154073 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"]
Apr 16 18:56:12.154415 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.154403 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="storage-initializer"
Apr 16 18:56:12.154466 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.154416 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="storage-initializer"
Apr 16 18:56:12.154466 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.154432 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container"
Apr 16 18:56:12.154466 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.154438 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container"
Apr 16 18:56:12.154573 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.154488 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9d10793-cc4b-41df-8478-5884f3f072d3" containerName="kserve-container"
Apr 16 18:56:12.157398 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.157380 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:56:12.166499 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.166475 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"]
Apr 16 18:56:12.178528 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.178500 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98f0b27b-fac9-4c1f-b8f2-0a122c091f7a-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-4jq7g\" (UID: \"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:56:12.279928 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.279898 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98f0b27b-fac9-4c1f-b8f2-0a122c091f7a-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-4jq7g\" (UID: \"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:56:12.280299 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.280281 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98f0b27b-fac9-4c1f-b8f2-0a122c091f7a-kserve-provision-location\") pod \"isvc-predictive-lightgbm-predictor-669896799c-4jq7g\" (UID: \"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:56:12.468174 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.468146 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:56:12.587770 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.587725 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"]
Apr 16 18:56:12.590787 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:56:12.590757 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98f0b27b_fac9_4c1f_b8f2_0a122c091f7a.slice/crio-b37050ff408f65d97f82c31c0f0b19392864b3df06f6c1db1415e9277c9f768b WatchSource:0}: Error finding container b37050ff408f65d97f82c31c0f0b19392864b3df06f6c1db1415e9277c9f768b: Status 404 returned error can't find the container with id b37050ff408f65d97f82c31c0f0b19392864b3df06f6c1db1415e9277c9f768b
Apr 16 18:56:12.645099 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:12.645072 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" event={"ID":"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a","Type":"ContainerStarted","Data":"b37050ff408f65d97f82c31c0f0b19392864b3df06f6c1db1415e9277c9f768b"}
Apr 16 18:56:13.649207 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:13.649170 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" event={"ID":"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a","Type":"ContainerStarted","Data":"fe40714484a3cb5cd7a0bbe5f51194e23d7b4d79291f9cb4b0ff92d5a3f5d4f5"}
Apr 16 18:56:15.449012 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:15.448953 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.44:8080: connect: connection refused"
Apr 16 18:56:16.620732 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.620709 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"
Apr 16 18:56:16.662885 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.662857 2566 generic.go:358] "Generic (PLEG): container finished" podID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerID="fe40714484a3cb5cd7a0bbe5f51194e23d7b4d79291f9cb4b0ff92d5a3f5d4f5" exitCode=0
Apr 16 18:56:16.663046 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.662930 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" event={"ID":"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a","Type":"ContainerDied","Data":"fe40714484a3cb5cd7a0bbe5f51194e23d7b4d79291f9cb4b0ff92d5a3f5d4f5"}
Apr 16 18:56:16.664292 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.664266 2566 generic.go:358] "Generic (PLEG): container finished" podID="a462062d-0378-406b-a95c-1a63a7171482" containerID="c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11" exitCode=0
Apr 16 18:56:16.664433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.664307 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" event={"ID":"a462062d-0378-406b-a95c-1a63a7171482","Type":"ContainerDied","Data":"c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11"}
Apr 16 18:56:16.664433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.664328 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" event={"ID":"a462062d-0378-406b-a95c-1a63a7171482","Type":"ContainerDied","Data":"8cc7af55c9dbabf9e64207b62266d2172d0e732d31335de62ac2d751b722221a"}
Apr 16 18:56:16.664433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.664330 2566 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj" Apr 16 18:56:16.664433 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.664349 2566 scope.go:117] "RemoveContainer" containerID="c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11" Apr 16 18:56:16.671656 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.671630 2566 scope.go:117] "RemoveContainer" containerID="32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2" Apr 16 18:56:16.678938 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.678913 2566 scope.go:117] "RemoveContainer" containerID="c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11" Apr 16 18:56:16.679277 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:56:16.679247 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11\": container with ID starting with c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11 not found: ID does not exist" containerID="c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11" Apr 16 18:56:16.679355 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.679290 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11"} err="failed to get container status \"c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11\": rpc error: code = NotFound desc = could not find container \"c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11\": container with ID starting with c89e1aa5227933a0f0c99d4ad11e8b7dfeb485777d79648d123ac0ea1b997a11 not found: ID does not exist" Apr 16 18:56:16.679355 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.679317 2566 scope.go:117] "RemoveContainer" containerID="32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2" Apr 16 18:56:16.679621 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:56:16.679599 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2\": container with ID starting with 32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2 not found: ID does not exist" containerID="32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2" Apr 16 18:56:16.679752 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.679629 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2"} err="failed to get container status \"32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2\": rpc error: code = NotFound desc = could not find container \"32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2\": container with ID starting with 32a11e36993e3b479ee672db6be3fd3c3fc9c0673cae4bdb1be1f502378456a2 not found: ID does not exist" Apr 16 18:56:16.712028 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.712009 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a462062d-0378-406b-a95c-1a63a7171482-kserve-provision-location\") pod \"a462062d-0378-406b-a95c-1a63a7171482\" (UID: \"a462062d-0378-406b-a95c-1a63a7171482\") " Apr 16 18:56:16.712293 ip-10-0-136-226 kubenswrapper[2566]: I0416 
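The paired "ContainerStatus from runtime service failed" / "DeleteContainer returned error" entries just above are the usual benign race: the kubelet asks CRI-O about a container the runtime has already pruned, and the runtime answers with gRPC NotFound. The log shows the kubelet simply recording the error and moving on. A minimal Go sketch of the same pattern, treating NotFound as "already deleted"; the helper name and the stub runtime call are illustrative, not taken from the kubelet source:

    package main

    import (
    	"context"
    	"fmt"

    	"google.golang.org/grpc/codes"
    	"google.golang.org/grpc/status"
    )

    // removeIfPresent asks the runtime to delete a container and treats
    // gRPC NotFound as success: the container is already gone, which is
    // exactly the situation logged above ("ID does not exist").
    func removeIfPresent(ctx context.Context, remove func(context.Context, string) error, id string) error {
    	err := remove(ctx, id)
    	if status.Code(err) == codes.NotFound {
    		return nil // already pruned by the runtime; nothing to do
    	}
    	return err
    }

    func main() {
    	// Stub standing in for a real CRI RemoveContainer RPC.
    	stub := func(ctx context.Context, id string) error {
    		return status.Error(codes.NotFound, "could not find container "+id)
    	}
    	fmt.Println(removeIfPresent(context.Background(), stub, "c89e1aa52279")) // prints <nil>
    }

That the error is not fatal is visible in the log itself: the same pods proceed to "SyncLoop REMOVE" and orphaned-volume cleanup immediately afterwards.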
Apr 16 18:56:16.712293 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.712264 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a462062d-0378-406b-a95c-1a63a7171482-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a462062d-0378-406b-a95c-1a63a7171482" (UID: "a462062d-0378-406b-a95c-1a63a7171482"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:56:16.813302 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.813223 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a462062d-0378-406b-a95c-1a63a7171482-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:56:16.988789 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.988759 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"]
Apr 16 18:56:16.993790 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:16.993767 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-predictor-794b54b9b4-pqcqj"]
Apr 16 18:56:17.668358 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:17.668322 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" event={"ID":"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a","Type":"ContainerStarted","Data":"5c6b787bc43990b891823d66c58ff74735c5c7af37e5867f871526dcefe7503c"}
Apr 16 18:56:17.668779 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:17.668688 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:56:17.669883 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:17.669857 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:56:17.689489 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:17.689444 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podStartSLOduration=5.689430457 podStartE2EDuration="5.689430457s" podCreationTimestamp="2026-04-16 18:56:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:56:17.687772581 +0000 UTC m=+2353.780060547" watchObservedRunningTime="2026-04-16 18:56:17.689430457 +0000 UTC m=+2353.781718422"
Apr 16 18:56:18.452397 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:18.452362 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a462062d-0378-406b-a95c-1a63a7171482" path="/var/lib/kubelet/pods/a462062d-0378-406b-a95c-1a63a7171482/volumes"
Apr 16 18:56:18.672162 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:18.672127 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:56:28.672140 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:28.672098 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:56:38.672796 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:38.672754 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:56:48.672852 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:48.672810 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:56:58.672769 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:56:58.672727 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:57:08.672309 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:08.672261 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:57:18.673106 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:18.673066 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.45:8080: connect: connection refused"
Apr 16 18:57:28.673924 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:28.673846 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:57:32.311486 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.311451 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"]
Apr 16 18:57:32.311845 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.311720 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" containerID="cri-o://5c6b787bc43990b891823d66c58ff74735c5c7af37e5867f871526dcefe7503c" gracePeriod=30
Apr 16 18:57:32.364305 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.364273 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"]
Apr 16 18:57:32.364610 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.364598 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container"
Apr 16 18:57:32.364664 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.364611 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container"
Apr 16 18:57:32.364664 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.364629 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="storage-initializer"
Apr 16 18:57:32.364664 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.364635 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="storage-initializer"
Apr 16 18:57:32.364769 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.364687 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a462062d-0378-406b-a95c-1a63a7171482" containerName="kserve-container"
Apr 16 18:57:32.367737 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.367719 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"
Apr 16 18:57:32.375980 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.375956 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"]
Apr 16 18:57:32.431975 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.431940 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcd15964-7680-4a2c-97e0-1a12b028640f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd\" (UID: \"dcd15964-7680-4a2c-97e0-1a12b028640f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"
Apr 16 18:57:32.532892 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.532866 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcd15964-7680-4a2c-97e0-1a12b028640f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd\" (UID: \"dcd15964-7680-4a2c-97e0-1a12b028640f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"
Apr 16 18:57:32.533216 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.533198 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcd15964-7680-4a2c-97e0-1a12b028640f-kserve-provision-location\") pod \"isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd\" (UID: \"dcd15964-7680-4a2c-97e0-1a12b028640f\") " pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"
Apr 16 18:57:32.678825 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.678798 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"
Apr 16 18:57:32.795723 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.795694 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"]
Apr 16 18:57:32.798182 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:57:32.798147 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcd15964_7680_4a2c_97e0_1a12b028640f.slice/crio-c0e3bb39cf6e2ac45c03463f3d9720a0c75f145a93c24a242a8464bb3350d2d4 WatchSource:0}: Error finding container c0e3bb39cf6e2ac45c03463f3d9720a0c75f145a93c24a242a8464bb3350d2d4: Status 404 returned error can't find the container with id c0e3bb39cf6e2ac45c03463f3d9720a0c75f145a93c24a242a8464bb3350d2d4
Apr 16 18:57:32.800039 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.800020 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 18:57:32.898159 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.898125 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" event={"ID":"dcd15964-7680-4a2c-97e0-1a12b028640f","Type":"ContainerStarted","Data":"44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5"}
Apr 16 18:57:32.898307 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:32.898168 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" event={"ID":"dcd15964-7680-4a2c-97e0-1a12b028640f","Type":"ContainerStarted","Data":"c0e3bb39cf6e2ac45c03463f3d9720a0c75f145a93c24a242a8464bb3350d2d4"}
Apr 16 18:57:36.911428 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:36.911356 2566 generic.go:358] "Generic (PLEG): container finished" podID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerID="5c6b787bc43990b891823d66c58ff74735c5c7af37e5867f871526dcefe7503c" exitCode=0
Apr 16 18:57:36.911813 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:36.911430 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" event={"ID":"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a","Type":"ContainerDied","Data":"5c6b787bc43990b891823d66c58ff74735c5c7af37e5867f871526dcefe7503c"}
Apr 16 18:57:36.912612 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:36.912592 2566 generic.go:358] "Generic (PLEG): container finished" podID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerID="44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5" exitCode=0
Apr 16 18:57:36.912664 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:36.912625 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" event={"ID":"dcd15964-7680-4a2c-97e0-1a12b028640f","Type":"ContainerDied","Data":"44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5"}
Apr 16 18:57:37.363888 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.363858 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:57:37.475247 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.475168 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98f0b27b-fac9-4c1f-b8f2-0a122c091f7a-kserve-provision-location\") pod \"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a\" (UID: \"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a\") "
Apr 16 18:57:37.475489 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.475465 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98f0b27b-fac9-4c1f-b8f2-0a122c091f7a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" (UID: "98f0b27b-fac9-4c1f-b8f2-0a122c091f7a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 18:57:37.576582 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.576550 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/98f0b27b-fac9-4c1f-b8f2-0a122c091f7a-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 18:57:37.916971 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.916934 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g" event={"ID":"98f0b27b-fac9-4c1f-b8f2-0a122c091f7a","Type":"ContainerDied","Data":"b37050ff408f65d97f82c31c0f0b19392864b3df06f6c1db1415e9277c9f768b"}
Apr 16 18:57:37.916971 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.916960 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"
Apr 16 18:57:37.917501 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.916985 2566 scope.go:117] "RemoveContainer" containerID="5c6b787bc43990b891823d66c58ff74735c5c7af37e5867f871526dcefe7503c"
Apr 16 18:57:37.918841 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.918817 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" event={"ID":"dcd15964-7680-4a2c-97e0-1a12b028640f","Type":"ContainerStarted","Data":"2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6"}
Apr 16 18:57:37.919054 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.919037 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"
Apr 16 18:57:37.924910 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.924889 2566 scope.go:117] "RemoveContainer" containerID="fe40714484a3cb5cd7a0bbe5f51194e23d7b4d79291f9cb4b0ff92d5a3f5d4f5"
Apr 16 18:57:37.942227 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.942187 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podStartSLOduration=5.942165161 podStartE2EDuration="5.942165161s" podCreationTimestamp="2026-04-16 18:57:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:57:37.939959543 +0000 UTC m=+2434.032247508" watchObservedRunningTime="2026-04-16 18:57:37.942165161 +0000 UTC m=+2434.034453125"
Apr 16 18:57:37.954023 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.953983 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"]
Apr 16 18:57:37.956682 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:37.956660 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-predictor-669896799c-4jq7g"]
Apr 16 18:57:38.451968 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:57:38.451933 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" path="/var/lib/kubelet/pods/98f0b27b-fac9-4c1f-b8f2-0a122c091f7a/volumes"
Apr 16 18:58:08.924047 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:58:08.923985 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.46:8080: connect: connection refused"
Apr 16 18:58:18.922730 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:58:18.922687 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.46:8080: connect: connection refused"
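Unlike the earlier TCP checks, the sklearn-v2 probe failures above come from an HTTP-style readiness check: the output wraps the dial error in a GET against the v2 model-ready endpoint, so the prober is issuing an HTTP request and the connection is refused until the server binds port 8080. A hedged sketch of an equivalent check, assuming any 2xx status counts as ready (the URL is taken from the log output; the timeout is illustrative):

    package main

    import (
    	"fmt"
    	"net/http"
    	"time"
    )

    // probeHTTP issues the same kind of GET the prober entries above
    // log, treating a 2xx answer as "ready". Until the server listens,
    // the request fails with the wrapped "connect: connection refused".
    func probeHTTP(url string) error {
    	client := &http.Client{Timeout: time.Second}
    	resp, err := client.Get(url)
    	if err != nil {
    		return err
    	}
    	defer resp.Body.Close()
    	if resp.StatusCode < 200 || resp.StatusCode >= 300 {
    		return fmt.Errorf("status %s", resp.Status)
    	}
    	return nil
    }

    func main() {
    	err := probeHTTP("http://10.132.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready")
    	if err != nil {
    		fmt.Println("not ready:", err)
    		return
    	}
    	fmt.Println("ready")
    }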
pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:58:38.923416 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:58:38.923371 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:58:48.923365 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:58:48.923322 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:58:54.454049 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:58:54.453948 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" Apr 16 18:59:02.535320 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.535286 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"] Apr 16 18:59:02.535677 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.535547 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" containerID="cri-o://2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6" gracePeriod=30 Apr 16 18:59:02.601032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.597983 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn"] Apr 16 18:59:02.601032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.598802 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" Apr 16 18:59:02.601032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.598824 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" Apr 16 18:59:02.601032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.598858 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="storage-initializer" Apr 16 18:59:02.601032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.598869 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="storage-initializer" Apr 16 18:59:02.601032 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.599060 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="98f0b27b-fac9-4c1f-b8f2-0a122c091f7a" containerName="kserve-container" Apr 16 18:59:02.603981 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.603279 2566 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 18:59:02.612573 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.612186 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn"] Apr 16 18:59:02.686885 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.686846 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/452713fe-7f9f-44bb-9af3-f22f1681bb54-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn\" (UID: \"452713fe-7f9f-44bb-9af3-f22f1681bb54\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 18:59:02.787705 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.787618 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/452713fe-7f9f-44bb-9af3-f22f1681bb54-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn\" (UID: \"452713fe-7f9f-44bb-9af3-f22f1681bb54\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 18:59:02.787987 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.787967 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/452713fe-7f9f-44bb-9af3-f22f1681bb54-kserve-provision-location\") pod \"isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn\" (UID: \"452713fe-7f9f-44bb-9af3-f22f1681bb54\") " pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 18:59:02.916870 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:02.916835 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 18:59:03.035501 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:03.035474 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn"] Apr 16 18:59:03.038252 ip-10-0-136-226 kubenswrapper[2566]: W0416 18:59:03.038194 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod452713fe_7f9f_44bb_9af3_f22f1681bb54.slice/crio-403874a9ca957b1109389378969be42ee032d873a97addad499d367c02d59b34 WatchSource:0}: Error finding container 403874a9ca957b1109389378969be42ee032d873a97addad499d367c02d59b34: Status 404 returned error can't find the container with id 403874a9ca957b1109389378969be42ee032d873a97addad499d367c02d59b34 Apr 16 18:59:03.181101 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:03.181062 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" event={"ID":"452713fe-7f9f-44bb-9af3-f22f1681bb54","Type":"ContainerStarted","Data":"9c1820a4892e0bf4c2399d6a1bb5e943df811ecadbe2509499116b3d1e67d6bb"} Apr 16 18:59:03.181101 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:03.181101 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" event={"ID":"452713fe-7f9f-44bb-9af3-f22f1681bb54","Type":"ContainerStarted","Data":"403874a9ca957b1109389378969be42ee032d873a97addad499d367c02d59b34"} Apr 16 18:59:04.450535 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:04.450493 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.46:8080/v2/models/isvc-predictive-sklearn-v2/ready\": dial tcp 10.132.0.46:8080: connect: connection refused" Apr 16 18:59:07.194912 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:07.194885 2566 generic.go:358] "Generic (PLEG): container finished" podID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerID="9c1820a4892e0bf4c2399d6a1bb5e943df811ecadbe2509499116b3d1e67d6bb" exitCode=0 Apr 16 18:59:07.195219 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:07.194941 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" event={"ID":"452713fe-7f9f-44bb-9af3-f22f1681bb54","Type":"ContainerDied","Data":"9c1820a4892e0bf4c2399d6a1bb5e943df811ecadbe2509499116b3d1e67d6bb"} Apr 16 18:59:07.380960 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:07.380929 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" Apr 16 18:59:07.427666 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:07.427584 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcd15964-7680-4a2c-97e0-1a12b028640f-kserve-provision-location\") pod \"dcd15964-7680-4a2c-97e0-1a12b028640f\" (UID: \"dcd15964-7680-4a2c-97e0-1a12b028640f\") " Apr 16 18:59:07.427906 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:07.427882 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcd15964-7680-4a2c-97e0-1a12b028640f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dcd15964-7680-4a2c-97e0-1a12b028640f" (UID: "dcd15964-7680-4a2c-97e0-1a12b028640f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 18:59:07.528378 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:07.528351 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcd15964-7680-4a2c-97e0-1a12b028640f-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 18:59:08.198791 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.198760 2566 generic.go:358] "Generic (PLEG): container finished" podID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerID="2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6" exitCode=0 Apr 16 18:59:08.199246 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.198825 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" Apr 16 18:59:08.199246 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.198832 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" event={"ID":"dcd15964-7680-4a2c-97e0-1a12b028640f","Type":"ContainerDied","Data":"2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6"} Apr 16 18:59:08.199246 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.198863 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd" event={"ID":"dcd15964-7680-4a2c-97e0-1a12b028640f","Type":"ContainerDied","Data":"c0e3bb39cf6e2ac45c03463f3d9720a0c75f145a93c24a242a8464bb3350d2d4"} Apr 16 18:59:08.199246 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.198883 2566 scope.go:117] "RemoveContainer" containerID="2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6" Apr 16 18:59:08.200801 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.200772 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" event={"ID":"452713fe-7f9f-44bb-9af3-f22f1681bb54","Type":"ContainerStarted","Data":"6bb0d04b8f636b9b9181e6a8a1f5cf83438c0566b5225845c7ce8e6d9bf9b679"} Apr 16 18:59:08.201016 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.200981 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 18:59:08.207783 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.207764 2566 scope.go:117] "RemoveContainer" 
containerID="44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5" Apr 16 18:59:08.214552 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.214536 2566 scope.go:117] "RemoveContainer" containerID="2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6" Apr 16 18:59:08.214804 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:59:08.214783 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6\": container with ID starting with 2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6 not found: ID does not exist" containerID="2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6" Apr 16 18:59:08.214852 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.214813 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6"} err="failed to get container status \"2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6\": rpc error: code = NotFound desc = could not find container \"2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6\": container with ID starting with 2584800d60fffb0f1a1789d2710f390def5b4ecde0031f877e6cffb9bcbc51a6 not found: ID does not exist" Apr 16 18:59:08.214852 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.214831 2566 scope.go:117] "RemoveContainer" containerID="44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5" Apr 16 18:59:08.215156 ip-10-0-136-226 kubenswrapper[2566]: E0416 18:59:08.215137 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5\": container with ID starting with 44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5 not found: ID does not exist" containerID="44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5" Apr 16 18:59:08.215215 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.215166 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5"} err="failed to get container status \"44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5\": rpc error: code = NotFound desc = could not find container \"44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5\": container with ID starting with 44613ce358cb347c0d516455b7f7dc0c96cad699a77174afd39c36e88f76bdb5 not found: ID does not exist" Apr 16 18:59:08.237944 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.237885 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" podStartSLOduration=6.237871477 podStartE2EDuration="6.237871477s" podCreationTimestamp="2026-04-16 18:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 18:59:08.237132088 +0000 UTC m=+2524.329420053" watchObservedRunningTime="2026-04-16 18:59:08.237871477 +0000 UTC m=+2524.330159448" Apr 16 18:59:08.255501 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.255480 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"] Apr 16 18:59:08.257672 ip-10-0-136-226 
kubenswrapper[2566]: I0416 18:59:08.257652 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-sklearn-v2-predictor-6fd9c49f4f-m94vd"] Apr 16 18:59:08.451912 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:08.451834 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" path="/var/lib/kubelet/pods/dcd15964-7680-4a2c-97e0-1a12b028640f/volumes" Apr 16 18:59:39.206854 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:39.206810 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:59:49.204804 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:49.204758 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 18:59:59.205299 ip-10-0-136-226 kubenswrapper[2566]: I0416 18:59:59.205251 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 19:00:09.205769 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:09.205726 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.47:8080/v2/models/isvc-predictive-xgboost-v2/ready\": dial tcp 10.132.0.47:8080: connect: connection refused" Apr 16 19:00:19.208593 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:19.208551 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 19:00:22.723203 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.723167 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn"] Apr 16 19:00:22.723742 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.723504 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container" containerID="cri-o://6bb0d04b8f636b9b9181e6a8a1f5cf83438c0566b5225845c7ce8e6d9bf9b679" gracePeriod=30 Apr 16 19:00:22.784623 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.784590 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"] Apr 16 19:00:22.785029 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.784984 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="storage-initializer" Apr 16 19:00:22.785133 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.785031 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="storage-initializer" Apr 16 19:00:22.785133 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.785067 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" Apr 16 19:00:22.785133 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.785076 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" Apr 16 19:00:22.785308 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.785160 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="dcd15964-7680-4a2c-97e0-1a12b028640f" containerName="kserve-container" Apr 16 19:00:22.788306 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.788280 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" Apr 16 19:00:22.796975 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.796952 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"] Apr 16 19:00:22.943352 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:22.943313 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc7c89e9-6c91-4dd9-8a49-9f170a750b60-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv\" (UID: \"fc7c89e9-6c91-4dd9-8a49-9f170a750b60\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" Apr 16 19:00:23.044699 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:23.044606 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc7c89e9-6c91-4dd9-8a49-9f170a750b60-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv\" (UID: \"fc7c89e9-6c91-4dd9-8a49-9f170a750b60\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" Apr 16 19:00:23.044962 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:23.044944 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc7c89e9-6c91-4dd9-8a49-9f170a750b60-kserve-provision-location\") pod \"isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv\" (UID: \"fc7c89e9-6c91-4dd9-8a49-9f170a750b60\") " pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" Apr 16 19:00:23.098652 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:23.098624 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" Apr 16 19:00:23.219528 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:23.219504 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"] Apr 16 19:00:23.222215 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:00:23.222184 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc7c89e9_6c91_4dd9_8a49_9f170a750b60.slice/crio-2f7b9d394b0e78821066f0837ff19abf6b2a35a7507e01a7c6ef465177e9e92b WatchSource:0}: Error finding container 2f7b9d394b0e78821066f0837ff19abf6b2a35a7507e01a7c6ef465177e9e92b: Status 404 returned error can't find the container with id 2f7b9d394b0e78821066f0837ff19abf6b2a35a7507e01a7c6ef465177e9e92b Apr 16 19:00:23.429886 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:23.429806 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" event={"ID":"fc7c89e9-6c91-4dd9-8a49-9f170a750b60","Type":"ContainerStarted","Data":"c7e61e24fe2cb6b7a4a03396fcdc299525aad5af13b5962e12ea9bf839345833"} Apr 16 19:00:23.429886 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:23.429843 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" event={"ID":"fc7c89e9-6c91-4dd9-8a49-9f170a750b60","Type":"ContainerStarted","Data":"2f7b9d394b0e78821066f0837ff19abf6b2a35a7507e01a7c6ef465177e9e92b"} Apr 16 19:00:27.443548 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.443514 2566 generic.go:358] "Generic (PLEG): container finished" podID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerID="c7e61e24fe2cb6b7a4a03396fcdc299525aad5af13b5962e12ea9bf839345833" exitCode=0 Apr 16 19:00:27.443970 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.443584 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" event={"ID":"fc7c89e9-6c91-4dd9-8a49-9f170a750b60","Type":"ContainerDied","Data":"c7e61e24fe2cb6b7a4a03396fcdc299525aad5af13b5962e12ea9bf839345833"} Apr 16 19:00:27.445356 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.445329 2566 generic.go:358] "Generic (PLEG): container finished" podID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerID="6bb0d04b8f636b9b9181e6a8a1f5cf83438c0566b5225845c7ce8e6d9bf9b679" exitCode=0 Apr 16 19:00:27.445502 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.445414 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" event={"ID":"452713fe-7f9f-44bb-9af3-f22f1681bb54","Type":"ContainerDied","Data":"6bb0d04b8f636b9b9181e6a8a1f5cf83438c0566b5225845c7ce8e6d9bf9b679"} Apr 16 19:00:27.470162 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.470145 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 19:00:27.584377 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.584344 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/452713fe-7f9f-44bb-9af3-f22f1681bb54-kserve-provision-location\") pod \"452713fe-7f9f-44bb-9af3-f22f1681bb54\" (UID: \"452713fe-7f9f-44bb-9af3-f22f1681bb54\") " Apr 16 19:00:27.584705 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.584679 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/452713fe-7f9f-44bb-9af3-f22f1681bb54-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "452713fe-7f9f-44bb-9af3-f22f1681bb54" (UID: "452713fe-7f9f-44bb-9af3-f22f1681bb54"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:00:27.685459 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:27.685382 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/452713fe-7f9f-44bb-9af3-f22f1681bb54-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:00:28.451728 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.451700 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" Apr 16 19:00:28.453261 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.453231 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" event={"ID":"fc7c89e9-6c91-4dd9-8a49-9f170a750b60","Type":"ContainerStarted","Data":"d98ed50ec9d005048baba4a323a25f19f8f82baf3fae814199012543ed0a085d"} Apr 16 19:00:28.453261 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.453261 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn" event={"ID":"452713fe-7f9f-44bb-9af3-f22f1681bb54","Type":"ContainerDied","Data":"403874a9ca957b1109389378969be42ee032d873a97addad499d367c02d59b34"} Apr 16 19:00:28.453458 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.453286 2566 scope.go:117] "RemoveContainer" containerID="6bb0d04b8f636b9b9181e6a8a1f5cf83438c0566b5225845c7ce8e6d9bf9b679" Apr 16 19:00:28.453516 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.453503 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" Apr 16 19:00:28.462158 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.462138 2566 scope.go:117] "RemoveContainer" containerID="9c1820a4892e0bf4c2399d6a1bb5e943df811ecadbe2509499116b3d1e67d6bb" Apr 16 19:00:28.475025 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.474971 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podStartSLOduration=6.474958361 podStartE2EDuration="6.474958361s" podCreationTimestamp="2026-04-16 19:00:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:00:28.472855277 +0000 UTC m=+2604.565143242" watchObservedRunningTime="2026-04-16 19:00:28.474958361 +0000 UTC 
m=+2604.567246326" Apr 16 19:00:28.486840 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.486813 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn"] Apr 16 19:00:28.491482 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:28.491460 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-xgboost-v2-predictor-67fd65d6cb-2xmmn"] Apr 16 19:00:30.451959 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:30.451927 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" path="/var/lib/kubelet/pods/452713fe-7f9f-44bb-9af3-f22f1681bb54/volumes" Apr 16 19:00:59.457676 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:00:59.457633 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 19:01:09.456051 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:09.456004 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 19:01:19.456561 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:19.456510 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 19:01:29.456181 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:29.456129 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 19:01:39.456031 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:39.455955 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.48:8080: connect: connection refused" Apr 16 19:01:46.452452 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:46.452420 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" Apr 16 19:01:52.937807 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:52.937767 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"] Apr 16 19:01:52.938280 ip-10-0-136-226 kubenswrapper[2566]: I0416 
Apr 16 19:01:52.938280 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:52.938053 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" containerID="cri-o://d98ed50ec9d005048baba4a323a25f19f8f82baf3fae814199012543ed0a085d" gracePeriod=30
Apr 16 19:01:55.249855 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.249782 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"]
Apr 16 19:01:55.250254 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.250113 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="storage-initializer"
Apr 16 19:01:55.250254 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.250124 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="storage-initializer"
Apr 16 19:01:55.250254 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.250140 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container"
Apr 16 19:01:55.250254 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.250145 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container"
Apr 16 19:01:55.250254 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.250203 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="452713fe-7f9f-44bb-9af3-f22f1681bb54" containerName="kserve-container"
Apr 16 19:01:55.253154 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.253138 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:01:55.269636 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.269609 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"]
Apr 16 19:01:55.287280 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.287258 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17db18c5-7965-48dc-a0fa-00546750fc62-kserve-provision-location\") pod \"isvc-sklearn-predictor-7bbf8748f4-4ljzb\" (UID: \"17db18c5-7965-48dc-a0fa-00546750fc62\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:01:55.387767 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.387736 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17db18c5-7965-48dc-a0fa-00546750fc62-kserve-provision-location\") pod \"isvc-sklearn-predictor-7bbf8748f4-4ljzb\" (UID: \"17db18c5-7965-48dc-a0fa-00546750fc62\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:01:55.388152 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.388133 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17db18c5-7965-48dc-a0fa-00546750fc62-kserve-provision-location\") pod \"isvc-sklearn-predictor-7bbf8748f4-4ljzb\" (UID: \"17db18c5-7965-48dc-a0fa-00546750fc62\") " pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:01:55.564020 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.563921 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:01:55.691653 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.691598 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"]
Apr 16 19:01:55.693971 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:01:55.693933 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17db18c5_7965_48dc_a0fa_00546750fc62.slice/crio-e89ba945f28895f81764202b6a1c66649f9662f4186d616f238a8a9896b9962f WatchSource:0}: Error finding container e89ba945f28895f81764202b6a1c66649f9662f4186d616f238a8a9896b9962f: Status 404 returned error can't find the container with id e89ba945f28895f81764202b6a1c66649f9662f4186d616f238a8a9896b9962f
Apr 16 19:01:55.715585 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:55.715560 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" event={"ID":"17db18c5-7965-48dc-a0fa-00546750fc62","Type":"ContainerStarted","Data":"e89ba945f28895f81764202b6a1c66649f9662f4186d616f238a8a9896b9962f"}
Apr 16 19:01:56.449766 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:56.449715 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready\": dial tcp 10.132.0.48:8080: connect: connection refused"
Apr 16 19:01:56.719750 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:56.719663 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" event={"ID":"17db18c5-7965-48dc-a0fa-00546750fc62","Type":"ContainerStarted","Data":"493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765"}
Apr 16 19:01:57.723812 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:57.723774 2566 generic.go:358] "Generic (PLEG): container finished" podID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerID="d98ed50ec9d005048baba4a323a25f19f8f82baf3fae814199012543ed0a085d" exitCode=0
Apr 16 19:01:57.724238 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:57.723845 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" event={"ID":"fc7c89e9-6c91-4dd9-8a49-9f170a750b60","Type":"ContainerDied","Data":"d98ed50ec9d005048baba4a323a25f19f8f82baf3fae814199012543ed0a085d"}
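
The probe output above also records exactly what the kubelet prober is hitting: an HTTP GET against the KServe v2 readiness endpoint, http://<pod-ip>:8080/v2/models/<model-name>/ready. A hedged sketch that reproduces the same check by hand, e.g. when debugging from inside the cluster network (the IP and model name are just the values this particular pod happened to get — substitute your own):

    import time
    import urllib.request
    import urllib.error

    # URL pattern copied from the probe output above; IP and model name are placeholders.
    URL = "http://10.132.0.48:8080/v2/models/isvc-predictive-lightgbm-v2/ready"

    for attempt in range(30):  # roughly mirrors the kubelet's 10 s probe period
        try:
            with urllib.request.urlopen(URL, timeout=1) as resp:
                print(f"attempt {attempt}: HTTP {resp.status} -> ready")
                break
        except (urllib.error.URLError, OSError) as exc:
            print(f"attempt {attempt}: not ready yet ({exc})")
            time.sleep(10)
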
Apr 16 19:01:57.879776 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:57.879752 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"
Apr 16 19:01:57.910144 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:57.910121 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc7c89e9-6c91-4dd9-8a49-9f170a750b60-kserve-provision-location\") pod \"fc7c89e9-6c91-4dd9-8a49-9f170a750b60\" (UID: \"fc7c89e9-6c91-4dd9-8a49-9f170a750b60\") "
Apr 16 19:01:57.910425 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:57.910401 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc7c89e9-6c91-4dd9-8a49-9f170a750b60-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fc7c89e9-6c91-4dd9-8a49-9f170a750b60" (UID: "fc7c89e9-6c91-4dd9-8a49-9f170a750b60"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:01:58.011165 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:58.011091 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fc7c89e9-6c91-4dd9-8a49-9f170a750b60-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 19:01:58.728923 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:58.728823 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv" event={"ID":"fc7c89e9-6c91-4dd9-8a49-9f170a750b60","Type":"ContainerDied","Data":"2f7b9d394b0e78821066f0837ff19abf6b2a35a7507e01a7c6ef465177e9e92b"}
Apr 16 19:01:58.728923 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:58.728865 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"
Apr 16 19:01:58.729485 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:58.728874 2566 scope.go:117] "RemoveContainer" containerID="d98ed50ec9d005048baba4a323a25f19f8f82baf3fae814199012543ed0a085d"
Apr 16 19:01:58.736703 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:58.736695 2566 scope.go:117] "RemoveContainer" containerID="c7e61e24fe2cb6b7a4a03396fcdc299525aad5af13b5962e12ea9bf839345833"
Apr 16 19:01:58.748483 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:58.748462 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"]
Apr 16 19:01:58.753679 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:58.753660 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-predictive-lightgbm-v2-predictor-65df45cf79-jcddv"]
Apr 16 19:01:59.734164 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:59.734132 2566 generic.go:358] "Generic (PLEG): container finished" podID="17db18c5-7965-48dc-a0fa-00546750fc62" containerID="493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765" exitCode=0
Apr 16 19:01:59.734533 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:01:59.734180 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" event={"ID":"17db18c5-7965-48dc-a0fa-00546750fc62","Type":"ContainerDied","Data":"493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765"}
Apr 16 19:02:00.451776 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:00.451746 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" path="/var/lib/kubelet/pods/fc7c89e9-6c91-4dd9-8a49-9f170a750b60/volumes"
Apr 16 19:02:00.738874 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:00.738792 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" event={"ID":"17db18c5-7965-48dc-a0fa-00546750fc62","Type":"ContainerStarted","Data":"728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235"}
Apr 16 19:02:00.739261 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:00.739094 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:02:00.740449 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:00.740422 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused"
Apr 16 19:02:00.759481 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:00.759440 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podStartSLOduration=5.759427616 podStartE2EDuration="5.759427616s" podCreationTimestamp="2026-04-16 19:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:02:00.757107334 +0000 UTC m=+2696.849395301" watchObservedRunningTime="2026-04-16 19:02:00.759427616 +0000 UTC m=+2696.851715578"
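
The pod_startup_latency_tracker record above is plain arithmetic: with both pull timestamps zeroed (the image was already on the node), podStartSLOduration is watchObservedRunningTime minus podCreationTimestamp, i.e. 19:02:00.759427616 − 19:01:55 ≈ 5.76 s for this sklearn predictor. A quick check (timestamps copied from the record; rounding to whole microseconds is mine):

    from datetime import datetime, timezone

    # Timestamps copied from the pod_startup_latency_tracker record above.
    created  = datetime(2026, 4, 16, 19, 1, 55, 0, tzinfo=timezone.utc)
    observed = datetime(2026, 4, 16, 19, 2, 0, 759428, tzinfo=timezone.utc)

    print((observed - created).total_seconds())  # 5.759428, matching podStartSLOduration
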
pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 19:02:11.742908 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:11.742860 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 19:02:21.743434 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:21.743390 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 19:02:31.742854 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:31.742809 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 19:02:41.743378 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:41.743332 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 19:02:51.743227 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:02:51.743180 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 19:03:01.743314 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:01.743268 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: connect: connection refused" Apr 16 19:03:09.449360 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:09.449331 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" Apr 16 19:03:15.287463 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.287430 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"] Apr 16 19:03:15.287856 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.287648 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" containerID="cri-o://728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235" gracePeriod=30 Apr 16 19:03:15.333420 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.333389 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"] Apr 16 19:03:15.333720 ip-10-0-136-226 kubenswrapper[2566]: I0416 
19:03:15.333708 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" Apr 16 19:03:15.333767 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.333722 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" Apr 16 19:03:15.333767 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.333740 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="storage-initializer" Apr 16 19:03:15.333767 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.333746 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="storage-initializer" Apr 16 19:03:15.333862 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.333809 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container" Apr 16 19:03:15.336773 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.336758 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" Apr 16 19:03:15.346099 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.346075 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"] Apr 16 19:03:15.459092 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.459051 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-kc4j6\" (UID: \"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" Apr 16 19:03:15.560107 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.560017 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-kc4j6\" (UID: \"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" Apr 16 19:03:15.560404 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.560384 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9-kserve-provision-location\") pod \"sklearn-v2-mlserver-predictor-695d5f5568-kc4j6\" (UID: \"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9\") " pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" Apr 16 19:03:15.648090 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.648057 2566 util.go:30] "No sandbox for pod can be found. 
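
Records like the RemoveStaleState burst above are klog structured logs: a quoted message followed by key="value" (or key={json}) pairs, so they can be picked apart without a log pipeline. A rough field extractor — heuristic, not klog's own parser: it treats everything after "=" as a string and does not decode the JSON-object form:

    import re

    # Matches key="quoted value" (with escaped quotes) and key=bare-token pairs.
    FIELD_RE = re.compile(r'(\w+)=("(?:[^"\\]|\\.)*"|\S+)')

    def parse_fields(line: str) -> dict:
        """Heuristic parse of a klog structured line into a field dict."""
        return {k: v.strip('"') for k, v in FIELD_RE.findall(line)}

    line = ('I0416 19:03:15.333708 2566 cpu_manager.go:401] '
            '"RemoveStaleState: containerMap: removing container" '
            'podUID="fc7c89e9-6c91-4dd9-8a49-9f170a750b60" containerName="kserve-container"')
    print(parse_fields(line)["podUID"])  # fc7c89e9-6c91-4dd9-8a49-9f170a750b60
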
Apr 16 19:03:15.648090 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.648057 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"
Apr 16 19:03:15.766898 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.766870 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"]
Apr 16 19:03:15.769273 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:03:15.769246 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7d8ca1c_81ff_4e3b_9aec_4a9124143ba9.slice/crio-318fd7461cb89f31ab7fd268693c805311825c496f1628b84458004572e1fadd WatchSource:0}: Error finding container 318fd7461cb89f31ab7fd268693c805311825c496f1628b84458004572e1fadd: Status 404 returned error can't find the container with id 318fd7461cb89f31ab7fd268693c805311825c496f1628b84458004572e1fadd
Apr 16 19:03:15.770984 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.770965 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:03:15.965202 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.965167 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" event={"ID":"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9","Type":"ContainerStarted","Data":"7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9"}
Apr 16 19:03:15.965202 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:15.965205 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" event={"ID":"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9","Type":"ContainerStarted","Data":"318fd7461cb89f31ab7fd268693c805311825c496f1628b84458004572e1fadd"}
Apr 16 19:03:19.526986 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.526964 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:03:19.599729 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.599696 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17db18c5-7965-48dc-a0fa-00546750fc62-kserve-provision-location\") pod \"17db18c5-7965-48dc-a0fa-00546750fc62\" (UID: \"17db18c5-7965-48dc-a0fa-00546750fc62\") "
Apr 16 19:03:19.600019 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.599974 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17db18c5-7965-48dc-a0fa-00546750fc62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "17db18c5-7965-48dc-a0fa-00546750fc62" (UID: "17db18c5-7965-48dc-a0fa-00546750fc62"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:03:19.700851 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.700823 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17db18c5-7965-48dc-a0fa-00546750fc62-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 19:03:19.979192 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.979158 2566 generic.go:358] "Generic (PLEG): container finished" podID="17db18c5-7965-48dc-a0fa-00546750fc62" containerID="728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235" exitCode=0
Apr 16 19:03:19.979380 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.979231 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"
Apr 16 19:03:19.979380 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.979242 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" event={"ID":"17db18c5-7965-48dc-a0fa-00546750fc62","Type":"ContainerDied","Data":"728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235"}
Apr 16 19:03:19.979380 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.979280 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" event={"ID":"17db18c5-7965-48dc-a0fa-00546750fc62","Type":"ContainerDied","Data":"e89ba945f28895f81764202b6a1c66649f9662f4186d616f238a8a9896b9962f"}
Apr 16 19:03:19.979380 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.979296 2566 scope.go:117] "RemoveContainer" containerID="728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235"
Apr 16 19:03:19.980601 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.980577 2566 generic.go:358] "Generic (PLEG): container finished" podID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerID="7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9" exitCode=0
Apr 16 19:03:19.980699 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.980627 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" event={"ID":"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9","Type":"ContainerDied","Data":"7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9"}
Apr 16 19:03:19.988657 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.988639 2566 scope.go:117] "RemoveContainer" containerID="493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765"
Apr 16 19:03:19.995726 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.995709 2566 scope.go:117] "RemoveContainer" containerID="728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235"
Apr 16 19:03:19.996025 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:03:19.995987 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235\": container with ID starting with 728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235 not found: ID does not exist" containerID="728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235"
Apr 16 19:03:19.996087 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.996034 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235"} err="failed to get container status \"728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235\": rpc error: code = NotFound desc = could not find container \"728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235\": container with ID starting with 728408c14e745f2205178c0bf026274092a737e933cb83a4409f336c5df41235 not found: ID does not exist"
Apr 16 19:03:19.996087 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.996052 2566 scope.go:117] "RemoveContainer" containerID="493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765"
Apr 16 19:03:19.996299 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:03:19.996280 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765\": container with ID starting with 493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765 not found: ID does not exist" containerID="493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765"
Apr 16 19:03:19.996350 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:19.996305 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765"} err="failed to get container status \"493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765\": rpc error: code = NotFound desc = could not find container \"493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765\": container with ID starting with 493215949cb9e8ffb51a36035665247d9469fe2f06654d3552600a93596b5765 not found: ID does not exist"
Apr 16 19:03:20.015896 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:20.015869 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"]
Apr 16 19:03:20.020657 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:20.020636 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb"]
Apr 16 19:03:20.448329 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:20.448285 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-predictor-7bbf8748f4-4ljzb" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.49:8080: i/o timeout"
Apr 16 19:03:20.452214 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:20.452190 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" path="/var/lib/kubelet/pods/17db18c5-7965-48dc-a0fa-00546750fc62/volumes"
Apr 16 19:03:20.986719 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:20.986684 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" event={"ID":"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9","Type":"ContainerStarted","Data":"26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2"}
Apr 16 19:03:20.987185 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:20.986880 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"
pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" podStartSLOduration=6.009310146 podStartE2EDuration="6.009310146s" podCreationTimestamp="2026-04-16 19:03:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:03:21.007776745 +0000 UTC m=+2777.100064710" watchObservedRunningTime="2026-04-16 19:03:21.009310146 +0000 UTC m=+2777.101598110" Apr 16 19:03:52.053832 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:03:52.053743 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:04:01.992141 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:01.992106 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" Apr 16 19:04:05.463581 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.463550 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"] Apr 16 19:04:05.463962 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.463797 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="kserve-container" containerID="cri-o://26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2" gracePeriod=30 Apr 16 19:04:05.517187 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.517159 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"] Apr 16 19:04:05.517499 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.517487 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="storage-initializer" Apr 16 19:04:05.517543 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.517501 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="storage-initializer" Apr 16 19:04:05.517543 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.517510 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" Apr 16 19:04:05.517543 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.517515 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" Apr 16 19:04:05.517640 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.517575 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="17db18c5-7965-48dc-a0fa-00546750fc62" containerName="kserve-container" Apr 16 19:04:05.521457 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.521437 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" Apr 16 19:04:05.532013 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.531967 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"] Apr 16 19:04:05.555043 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.554987 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45d0369-6759-4a9d-bb58-eac843698b34-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-b7c8746d-bbnwf\" (UID: \"b45d0369-6759-4a9d-bb58-eac843698b34\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" Apr 16 19:04:05.656182 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.656149 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45d0369-6759-4a9d-bb58-eac843698b34-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-b7c8746d-bbnwf\" (UID: \"b45d0369-6759-4a9d-bb58-eac843698b34\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" Apr 16 19:04:05.656526 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.656508 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45d0369-6759-4a9d-bb58-eac843698b34-kserve-provision-location\") pod \"isvc-sklearn-runtime-predictor-b7c8746d-bbnwf\" (UID: \"b45d0369-6759-4a9d-bb58-eac843698b34\") " pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" Apr 16 19:04:05.832730 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.832642 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" Apr 16 19:04:05.950234 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:05.950209 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"] Apr 16 19:04:05.952832 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:04:05.952802 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45d0369_6759_4a9d_bb58_eac843698b34.slice/crio-82bfec9db15d4359442e8fcccbfbada5692957da5f00a6855047491e93e96f8f WatchSource:0}: Error finding container 82bfec9db15d4359442e8fcccbfbada5692957da5f00a6855047491e93e96f8f: Status 404 returned error can't find the container with id 82bfec9db15d4359442e8fcccbfbada5692957da5f00a6855047491e93e96f8f Apr 16 19:04:06.127231 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:06.127194 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" event={"ID":"b45d0369-6759-4a9d-bb58-eac843698b34","Type":"ContainerStarted","Data":"8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043"} Apr 16 19:04:06.127421 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:06.127238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" event={"ID":"b45d0369-6759-4a9d-bb58-eac843698b34","Type":"ContainerStarted","Data":"82bfec9db15d4359442e8fcccbfbada5692957da5f00a6855047491e93e96f8f"} Apr 16 19:04:11.989929 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:11.989886 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.50:8080/v2/models/sklearn-v2-mlserver/ready\": dial tcp 10.132.0.50:8080: connect: connection refused" Apr 16 19:04:12.148567 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:12.148535 2566 generic.go:358] "Generic (PLEG): container finished" podID="b45d0369-6759-4a9d-bb58-eac843698b34" containerID="8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043" exitCode=0 Apr 16 19:04:12.148737 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:12.148574 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" event={"ID":"b45d0369-6759-4a9d-bb58-eac843698b34","Type":"ContainerDied","Data":"8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043"} Apr 16 19:04:13.105355 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.105333 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" Apr 16 19:04:13.153358 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.153329 2566 generic.go:358] "Generic (PLEG): container finished" podID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerID="26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2" exitCode=0 Apr 16 19:04:13.153529 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.153399 2566 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 19:04:13.153529 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.153399 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"
Apr 16 19:04:13.153529 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.153410 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" event={"ID":"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9","Type":"ContainerDied","Data":"26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2"}
Apr 16 19:04:13.153529 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.153450 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6" event={"ID":"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9","Type":"ContainerDied","Data":"318fd7461cb89f31ab7fd268693c805311825c496f1628b84458004572e1fadd"}
Apr 16 19:04:13.153529 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.153469 2566 scope.go:117] "RemoveContainer" containerID="26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2"
Apr 16 19:04:13.155142 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.155121 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" event={"ID":"b45d0369-6759-4a9d-bb58-eac843698b34","Type":"ContainerStarted","Data":"73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a"}
Apr 16 19:04:13.155416 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.155396 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"
Apr 16 19:04:13.156736 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.156708 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 16 19:04:13.161595 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.161569 2566 scope.go:117] "RemoveContainer" containerID="7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9"
Apr 16 19:04:13.168762 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.168746 2566 scope.go:117] "RemoveContainer" containerID="26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2"
Apr 16 19:04:13.169056 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:04:13.169038 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2\": container with ID starting with 26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2 not found: ID does not exist" containerID="26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2"
Apr 16 19:04:13.169113 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.169063 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2"} err="failed to get container status \"26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2\": rpc error: code = NotFound desc = could not find container \"26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2\": container with ID starting with 26044aaea1442db6188e9d3b16f1d2dcfbbd2241b95ad09b286a18738063c3d2 not found: ID does not exist"
Apr 16 19:04:13.169113 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.169087 2566 scope.go:117] "RemoveContainer" containerID="7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9"
Apr 16 19:04:13.169337 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:04:13.169316 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9\": container with ID starting with 7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9 not found: ID does not exist" containerID="7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9"
Apr 16 19:04:13.169415 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.169340 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9"} err="failed to get container status \"7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9\": rpc error: code = NotFound desc = could not find container \"7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9\": container with ID starting with 7db218be18852acf2a57efa28b219cce9cb5e29de0925a075a289cecd137a4a9 not found: ID does not exist"
Apr 16 19:04:13.175958 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.175913 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" podStartSLOduration=8.175897095 podStartE2EDuration="8.175897095s" podCreationTimestamp="2026-04-16 19:04:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:04:13.17489583 +0000 UTC m=+2829.267183795" watchObservedRunningTime="2026-04-16 19:04:13.175897095 +0000 UTC m=+2829.268185061"
Apr 16 19:04:13.221748 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.221724 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9-kserve-provision-location\") pod \"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9\" (UID: \"c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9\") "
Apr 16 19:04:13.222086 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.222059 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" (UID: "c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:04:13.322621 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.322535 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 19:04:13.475399 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.475365 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"]
Apr 16 19:04:13.480979 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:13.480955 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/sklearn-v2-mlserver-predictor-695d5f5568-kc4j6"]
Apr 16 19:04:14.159422 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:14.159384 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 16 19:04:14.452098 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:14.452021 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" path="/var/lib/kubelet/pods/c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9/volumes"
Apr 16 19:04:24.160341 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:24.160300 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.51:8080: connect: connection refused"
Apr 16 19:04:34.160543 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:34.160510 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"
Apr 16 19:04:42.551230 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.551200 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-runtime-predictor-b7c8746d-bbnwf_b45d0369-6759-4a9d-bb58-eac843698b34/kserve-container/0.log"
Apr 16 19:04:42.703836 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.703802 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"]
Apr 16 19:04:42.704130 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.704090 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="kserve-container" containerID="cri-o://73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a" gracePeriod=30
Apr 16 19:04:42.761465 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.761430 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"]
Apr 16 19:04:42.761790 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.761777 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="kserve-container"
podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="kserve-container" Apr 16 19:04:42.761834 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.761815 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="storage-initializer" Apr 16 19:04:42.761834 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.761820 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="storage-initializer" Apr 16 19:04:42.761938 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.761886 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="c7d8ca1c-81ff-4e3b-9aec-4a9124143ba9" containerName="kserve-container" Apr 16 19:04:42.764785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.764763 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" Apr 16 19:04:42.772971 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.772941 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"] Apr 16 19:04:42.777714 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.777687 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc26b3ea-e1ae-46bb-9084-360936904978-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9\" (UID: \"bc26b3ea-e1ae-46bb-9084-360936904978\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" Apr 16 19:04:42.878356 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.878325 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc26b3ea-e1ae-46bb-9084-360936904978-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9\" (UID: \"bc26b3ea-e1ae-46bb-9084-360936904978\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" Apr 16 19:04:42.878704 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:42.878683 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc26b3ea-e1ae-46bb-9084-360936904978-kserve-provision-location\") pod \"isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9\" (UID: \"bc26b3ea-e1ae-46bb-9084-360936904978\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" Apr 16 19:04:43.076039 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:43.075978 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" Apr 16 19:04:43.201202 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:43.201165 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"] Apr 16 19:04:43.204389 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:04:43.204351 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc26b3ea_e1ae_46bb_9084_360936904978.slice/crio-5d32653abe6a6cb0d9e3d469d09a4b2f0838865739c048179542f537f7968f05 WatchSource:0}: Error finding container 5d32653abe6a6cb0d9e3d469d09a4b2f0838865739c048179542f537f7968f05: Status 404 returned error can't find the container with id 5d32653abe6a6cb0d9e3d469d09a4b2f0838865739c048179542f537f7968f05 Apr 16 19:04:43.244055 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:43.244025 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" event={"ID":"bc26b3ea-e1ae-46bb-9084-360936904978","Type":"ContainerStarted","Data":"5d32653abe6a6cb0d9e3d469d09a4b2f0838865739c048179542f537f7968f05"} Apr 16 19:04:43.832059 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:43.832035 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" Apr 16 19:04:43.886681 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:43.886655 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45d0369-6759-4a9d-bb58-eac843698b34-kserve-provision-location\") pod \"b45d0369-6759-4a9d-bb58-eac843698b34\" (UID: \"b45d0369-6759-4a9d-bb58-eac843698b34\") " Apr 16 19:04:43.910018 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:43.909976 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b45d0369-6759-4a9d-bb58-eac843698b34-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b45d0369-6759-4a9d-bb58-eac843698b34" (UID: "b45d0369-6759-4a9d-bb58-eac843698b34"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:04:43.987596 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:43.987520 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b45d0369-6759-4a9d-bb58-eac843698b34-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:04:44.248162 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.248069 2566 generic.go:358] "Generic (PLEG): container finished" podID="b45d0369-6759-4a9d-bb58-eac843698b34" containerID="73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a" exitCode=0 Apr 16 19:04:44.248322 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.248155 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" event={"ID":"b45d0369-6759-4a9d-bb58-eac843698b34","Type":"ContainerDied","Data":"73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a"} Apr 16 19:04:44.248322 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.248172 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" Apr 16 19:04:44.248322 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.248197 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf" event={"ID":"b45d0369-6759-4a9d-bb58-eac843698b34","Type":"ContainerDied","Data":"82bfec9db15d4359442e8fcccbfbada5692957da5f00a6855047491e93e96f8f"} Apr 16 19:04:44.248322 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.248218 2566 scope.go:117] "RemoveContainer" containerID="73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a" Apr 16 19:04:44.249670 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.249642 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" event={"ID":"bc26b3ea-e1ae-46bb-9084-360936904978","Type":"ContainerStarted","Data":"efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945"} Apr 16 19:04:44.256843 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.256827 2566 scope.go:117] "RemoveContainer" containerID="8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043" Apr 16 19:04:44.263832 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.263804 2566 scope.go:117] "RemoveContainer" containerID="73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a" Apr 16 19:04:44.264088 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:04:44.264055 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a\": container with ID starting with 73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a not found: ID does not exist" containerID="73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a" Apr 16 19:04:44.264143 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.264095 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a"} err="failed to get container status \"73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a\": rpc error: code = NotFound desc = could not find container \"73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a\": container with ID starting with 73245f33060e7ad84082f60931c09e5a1d0f41f12b1d6275aa8c6b89c64c280a not found: ID does not exist" Apr 16 19:04:44.264143 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.264112 2566 scope.go:117] "RemoveContainer" containerID="8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043" Apr 16 19:04:44.264363 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:04:44.264343 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043\": container with ID starting with 8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043 not found: ID does not exist" containerID="8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043" Apr 16 19:04:44.264413 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.264370 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043"} err="failed to get container status \"8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043\": rpc 
error: code = NotFound desc = could not find container \"8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043\": container with ID starting with 8a0ce3cda49be5512e9bc40a2e05bced995e3beb2026cbafd4117aa9f3b27043 not found: ID does not exist" Apr 16 19:04:44.292811 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.292788 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"] Apr 16 19:04:44.299360 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.299341 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-runtime-predictor-b7c8746d-bbnwf"] Apr 16 19:04:44.452281 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:44.452248 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" path="/var/lib/kubelet/pods/b45d0369-6759-4a9d-bb58-eac843698b34/volumes" Apr 16 19:04:47.265559 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:47.265478 2566 generic.go:358] "Generic (PLEG): container finished" podID="bc26b3ea-e1ae-46bb-9084-360936904978" containerID="efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945" exitCode=0 Apr 16 19:04:47.265559 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:47.265514 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" event={"ID":"bc26b3ea-e1ae-46bb-9084-360936904978","Type":"ContainerDied","Data":"efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945"} Apr 16 19:04:48.270377 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:48.270338 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" event={"ID":"bc26b3ea-e1ae-46bb-9084-360936904978","Type":"ContainerStarted","Data":"28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78"} Apr 16 19:04:48.270949 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:48.270557 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" Apr 16 19:04:48.290275 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:04:48.290231 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" podStartSLOduration=6.29021964 podStartE2EDuration="6.29021964s" podCreationTimestamp="2026-04-16 19:04:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:04:48.288126268 +0000 UTC m=+2864.380414233" watchObservedRunningTime="2026-04-16 19:04:48.29021964 +0000 UTC m=+2864.382507605" Apr 16 19:05:19.354416 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:19.354368 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:05:29.276941 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:29.276903 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" Apr 16 19:05:32.887669 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.887630 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"] Apr 16 19:05:32.888143 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.887872 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="kserve-container" containerID="cri-o://28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78" gracePeriod=30 Apr 16 19:05:32.937151 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.937116 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz"] Apr 16 19:05:32.937455 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.937443 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="kserve-container" Apr 16 19:05:32.937499 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.937456 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="kserve-container" Apr 16 19:05:32.937499 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.937474 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="storage-initializer" Apr 16 19:05:32.937499 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.937479 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="storage-initializer" Apr 16 19:05:32.937599 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.937528 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="b45d0369-6759-4a9d-bb58-eac843698b34" containerName="kserve-container" Apr 16 19:05:32.939412 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.939394 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:05:32.947869 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:32.947844 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz"] Apr 16 19:05:33.090370 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:33.090336 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5328b66-3796-469a-b73a-e6040b0e3c3b-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-9bcff48d6-tmdbz\" (UID: \"d5328b66-3796-469a-b73a-e6040b0e3c3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:05:33.191757 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:33.191668 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5328b66-3796-469a-b73a-e6040b0e3c3b-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-9bcff48d6-tmdbz\" (UID: \"d5328b66-3796-469a-b73a-e6040b0e3c3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:05:33.192086 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:33.192070 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5328b66-3796-469a-b73a-e6040b0e3c3b-kserve-provision-location\") pod \"isvc-sklearn-v2-predictor-9bcff48d6-tmdbz\" (UID: \"d5328b66-3796-469a-b73a-e6040b0e3c3b\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:05:33.250579 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:33.250549 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:05:33.373515 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:33.373488 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz"] Apr 16 19:05:33.376587 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:05:33.376559 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5328b66_3796_469a_b73a_e6040b0e3c3b.slice/crio-08c58244b42bca7c675fb676fb191f378ab51f244af7d26f2590c72d3b1af652 WatchSource:0}: Error finding container 08c58244b42bca7c675fb676fb191f378ab51f244af7d26f2590c72d3b1af652: Status 404 returned error can't find the container with id 08c58244b42bca7c675fb676fb191f378ab51f244af7d26f2590c72d3b1af652 Apr 16 19:05:33.405814 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:33.405789 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" event={"ID":"d5328b66-3796-469a-b73a-e6040b0e3c3b","Type":"ContainerStarted","Data":"08c58244b42bca7c675fb676fb191f378ab51f244af7d26f2590c72d3b1af652"} Apr 16 19:05:34.409642 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:34.409597 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" event={"ID":"d5328b66-3796-469a-b73a-e6040b0e3c3b","Type":"ContainerStarted","Data":"5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8"} Apr 16 19:05:37.420242 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:37.420203 2566 generic.go:358] "Generic (PLEG): container finished" podID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerID="5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8" exitCode=0 Apr 16 19:05:37.420672 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:37.420276 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" event={"ID":"d5328b66-3796-469a-b73a-e6040b0e3c3b","Type":"ContainerDied","Data":"5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8"} Apr 16 19:05:38.425432 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:38.425398 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" event={"ID":"d5328b66-3796-469a-b73a-e6040b0e3c3b","Type":"ContainerStarted","Data":"8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1"} Apr 16 19:05:38.425857 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:38.425682 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:05:38.426852 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:38.426829 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 16 19:05:38.445652 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:38.445603 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podStartSLOduration=6.445587311 podStartE2EDuration="6.445587311s" podCreationTimestamp="2026-04-16 19:05:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
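From 19:05:38 onward the readiness probe for the new predictor fails every 10 seconds with "connection refused" (and the earlier v2-runtime pod briefly answered HTTP 400): the usual picture of a model server that has not yet bound or warmed port 8080. A rough Go equivalent of what an HTTP readiness check does each period is sketched below; the kubelet counts any 2xx–3xx status as success, but the URL, path, and loop bounds here are invented for illustration.

```go
// Sketch of a kubelet-style HTTP readiness loop: poll until the endpoint
// answers with a 2xx/3xx status. "connection refused" during startup and
// non-2xx answers both count as "not ready".
package main

import (
	"fmt"
	"net/http"
	"time"
)

func ready(url string) bool {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false // e.g. "connect: connection refused" while starting
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400
}

func main() {
	for i := 0; i < 30; i++ {
		if ready("http://10.132.0.53:8080/") { // port taken from the log; path assumed
			fmt.Println("ready")
			return
		}
		time.Sleep(10 * time.Second) // the log shows a 10s probe period
	}
	fmt.Println("never became ready")
}
```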
Apr 16 19:05:39.275424 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:39.275382 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.52:8080/v2/models/isvc-sklearn-v2-runtime/ready\": dial tcp 10.132.0.52:8080: connect: connection refused"
Apr 16 19:05:39.429343 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:39.429304 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 16 19:05:40.025747 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.025722 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"
Apr 16 19:05:40.046209 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.046185 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc26b3ea-e1ae-46bb-9084-360936904978-kserve-provision-location\") pod \"bc26b3ea-e1ae-46bb-9084-360936904978\" (UID: \"bc26b3ea-e1ae-46bb-9084-360936904978\") "
Apr 16 19:05:40.046503 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.046485 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc26b3ea-e1ae-46bb-9084-360936904978-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bc26b3ea-e1ae-46bb-9084-360936904978" (UID: "bc26b3ea-e1ae-46bb-9084-360936904978"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:05:40.147435 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.147347 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bc26b3ea-e1ae-46bb-9084-360936904978-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 19:05:40.433487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.433390 2566 generic.go:358] "Generic (PLEG): container finished" podID="bc26b3ea-e1ae-46bb-9084-360936904978" containerID="28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78" exitCode=0
Apr 16 19:05:40.433487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.433430 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" event={"ID":"bc26b3ea-e1ae-46bb-9084-360936904978","Type":"ContainerDied","Data":"28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78"}
Apr 16 19:05:40.433487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.433459 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9" event={"ID":"bc26b3ea-e1ae-46bb-9084-360936904978","Type":"ContainerDied","Data":"5d32653abe6a6cb0d9e3d469d09a4b2f0838865739c048179542f537f7968f05"}
Apr 16 19:05:40.433487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.433463 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"
Apr 16 19:05:40.433487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.433477 2566 scope.go:117] "RemoveContainer" containerID="28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78"
Apr 16 19:05:40.441690 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.441672 2566 scope.go:117] "RemoveContainer" containerID="efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945"
Apr 16 19:05:40.448691 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.448663 2566 scope.go:117] "RemoveContainer" containerID="28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78"
Apr 16 19:05:40.448962 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:05:40.448939 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78\": container with ID starting with 28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78 not found: ID does not exist" containerID="28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78"
Apr 16 19:05:40.449056 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.448972 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78"} err="failed to get container status \"28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78\": rpc error: code = NotFound desc = could not find container \"28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78\": container with ID starting with 28b11800777d4153b845ef4f0893a4682f5fecc6affa1669154f782355404d78 not found: ID does not exist"
Apr 16 19:05:40.449056 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.449004 2566 scope.go:117] "RemoveContainer" containerID="efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945"
Apr 16 19:05:40.449258 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:05:40.449237 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945\": container with ID starting with efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945 not found: ID does not exist" containerID="efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945"
Apr 16 19:05:40.449301 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.449268 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945"} err="failed to get container status \"efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945\": rpc error: code = NotFound desc = could not find container \"efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945\": container with ID starting with efbfa4801232f979798928bb960041f23867720fc4052ec83e05a7654c136945 not found: ID does not exist"
Apr 16 19:05:40.457679 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.457656 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"]
Apr 16 19:05:40.461709 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:40.461685 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-runtime-predictor-7dd9b85c64-9fgz9"]
Apr 16 19:05:42.454553 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:42.454517 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" path="/var/lib/kubelet/pods/bc26b3ea-e1ae-46bb-9084-360936904978/volumes"
Apr 16 19:05:49.429949 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:49.429903 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 16 19:05:59.429608 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:05:59.429563 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 16 19:06:09.429821 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:09.429768 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 16 19:06:19.429952 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:19.429906 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
Apr 16 19:06:29.429619 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:29.429530 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused"
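Every predictor here mounts a single emptyDir, kserve-provision-location, and the reconciler entries in this stretch trace its whole life: VerifyControllerAttachedVolume and MountVolume.SetUp at pod start, UnmountVolume.TearDown and "Volume detached" at deletion, then removal of /var/lib/kubelet/pods/<uid>/volumes ("Cleaned up orphaned pod volumes dir"). If you ever need to locate such a volume on the node, the conventional kubelet layout is sketched below; the kubernetes.io~empty-dir segment is the usual on-disk convention, but verify it on your own node before relying on it.

```go
// Illustrative sketch: build the on-node path of an emptyDir volume from a
// pod UID and volume name seen in this log. The layout
// /var/lib/kubelet/pods/<uid>/volumes/kubernetes.io~empty-dir/<name>
// is the kubelet's usual convention and an assumption here.
package main

import (
	"fmt"
	"path/filepath"
)

func emptyDirPath(podUID, volume string) string {
	return filepath.Join("/var/lib/kubelet/pods", podUID,
		"volumes", "kubernetes.io~empty-dir", volume)
}

func main() {
	fmt.Println(emptyDirPath("d5328b66-3796-469a-b73a-e6040b0e3c3b",
		"kserve-provision-location"))
}
```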
probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 16 19:06:39.430216 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:39.430163 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.53:8080: connect: connection refused" Apr 16 19:06:49.430627 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:49.430593 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:06:53.186184 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.186151 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz"] Apr 16 19:06:53.187115 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.187057 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" containerID="cri-o://8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1" gracePeriod=30 Apr 16 19:06:53.249077 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.249035 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"] Apr 16 19:06:53.249451 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.249433 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="storage-initializer" Apr 16 19:06:53.249535 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.249452 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="storage-initializer" Apr 16 19:06:53.249535 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.249468 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="kserve-container" Apr 16 19:06:53.249535 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.249476 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="kserve-container" Apr 16 19:06:53.249700 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.249570 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bc26b3ea-e1ae-46bb-9084-360936904978" containerName="kserve-container" Apr 16 19:06:53.252724 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.252704 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" Apr 16 19:06:53.261388 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.261364 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"] Apr 16 19:06:53.358986 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.358946 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b469cb4-d206-4147-8b90-6bb6e89a22f8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk\" (UID: \"7b469cb4-d206-4147-8b90-6bb6e89a22f8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" Apr 16 19:06:53.460483 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.460387 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b469cb4-d206-4147-8b90-6bb6e89a22f8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk\" (UID: \"7b469cb4-d206-4147-8b90-6bb6e89a22f8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" Apr 16 19:06:53.460735 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.460714 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b469cb4-d206-4147-8b90-6bb6e89a22f8-kserve-provision-location\") pod \"isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk\" (UID: \"7b469cb4-d206-4147-8b90-6bb6e89a22f8\") " pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" Apr 16 19:06:53.563189 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.563161 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" Apr 16 19:06:53.682454 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:53.682432 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"] Apr 16 19:06:53.684629 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:06:53.684602 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b469cb4_d206_4147_8b90_6bb6e89a22f8.slice/crio-754d24fd492a7cb64f7c8d3555b07064b74abfad584e856f92b6f7b7abe3683b WatchSource:0}: Error finding container 754d24fd492a7cb64f7c8d3555b07064b74abfad584e856f92b6f7b7abe3683b: Status 404 returned error can't find the container with id 754d24fd492a7cb64f7c8d3555b07064b74abfad584e856f92b6f7b7abe3683b Apr 16 19:06:54.647351 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:54.647310 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" event={"ID":"7b469cb4-d206-4147-8b90-6bb6e89a22f8","Type":"ContainerStarted","Data":"54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1"} Apr 16 19:06:54.647351 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:54.647355 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" event={"ID":"7b469cb4-d206-4147-8b90-6bb6e89a22f8","Type":"ContainerStarted","Data":"754d24fd492a7cb64f7c8d3555b07064b74abfad584e856f92b6f7b7abe3683b"} Apr 16 19:06:57.437762 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.437734 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:06:57.495588 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.495458 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5328b66-3796-469a-b73a-e6040b0e3c3b-kserve-provision-location\") pod \"d5328b66-3796-469a-b73a-e6040b0e3c3b\" (UID: \"d5328b66-3796-469a-b73a-e6040b0e3c3b\") " Apr 16 19:06:57.495824 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.495799 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d5328b66-3796-469a-b73a-e6040b0e3c3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d5328b66-3796-469a-b73a-e6040b0e3c3b" (UID: "d5328b66-3796-469a-b73a-e6040b0e3c3b"). InnerVolumeSpecName "kserve-provision-location". 
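The repeated three-beat pattern around every predictor in this log (a first container starts, exits 0 a few seconds later, then kserve-container starts) is kserve's storage-initializer init container fetching the model before the server runs; the cpu_manager RemoveStaleState entries name it explicitly. A small sketch of detecting "all init containers finished cleanly" from pod status is below; the corev1 field names are the real client-go API types, but the helper itself is illustrative, not kserve or kubelet code.

```go
// Sketch: report whether a pod's init containers (e.g. kserve's
// storage-initializer) have all terminated with exit code 0, the state the
// PLEG ContainerDied/exitCode=0 events in this log correspond to.
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func initDone(pod *corev1.Pod) bool {
	for _, s := range pod.Status.InitContainerStatuses {
		t := s.State.Terminated
		if t == nil || t.ExitCode != 0 {
			return false // still running, waiting, or failed
		}
	}
	return true
}

func main() {
	pod := &corev1.Pod{Status: corev1.PodStatus{
		InitContainerStatuses: []corev1.ContainerStatus{{
			Name:  "storage-initializer",
			State: corev1.ContainerState{Terminated: &corev1.ContainerStateTerminated{ExitCode: 0}},
		}},
	}}
	fmt.Println(initDone(pod)) // true
}
```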
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:06:57.596653 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.596621 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d5328b66-3796-469a-b73a-e6040b0e3c3b-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:06:57.658979 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.658949 2566 generic.go:358] "Generic (PLEG): container finished" podID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerID="54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1" exitCode=0 Apr 16 19:06:57.659132 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.659023 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" event={"ID":"7b469cb4-d206-4147-8b90-6bb6e89a22f8","Type":"ContainerDied","Data":"54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1"} Apr 16 19:06:57.660515 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.660491 2566 generic.go:358] "Generic (PLEG): container finished" podID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerID="8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1" exitCode=0 Apr 16 19:06:57.660628 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.660539 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" event={"ID":"d5328b66-3796-469a-b73a-e6040b0e3c3b","Type":"ContainerDied","Data":"8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1"} Apr 16 19:06:57.660628 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.660548 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" Apr 16 19:06:57.660628 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.660563 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz" event={"ID":"d5328b66-3796-469a-b73a-e6040b0e3c3b","Type":"ContainerDied","Data":"08c58244b42bca7c675fb676fb191f378ab51f244af7d26f2590c72d3b1af652"} Apr 16 19:06:57.660628 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.660582 2566 scope.go:117] "RemoveContainer" containerID="8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1" Apr 16 19:06:57.668730 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.668706 2566 scope.go:117] "RemoveContainer" containerID="5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8" Apr 16 19:06:57.675716 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.675694 2566 scope.go:117] "RemoveContainer" containerID="8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1" Apr 16 19:06:57.676006 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:06:57.675969 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1\": container with ID starting with 8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1 not found: ID does not exist" containerID="8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1" Apr 16 19:06:57.676081 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.676013 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1"} err="failed to get container status \"8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1\": rpc error: code = NotFound desc = could not find container \"8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1\": container with ID starting with 8ec3e1994337628d338da10ec03fe35da57dc583003808a7ce8af8d5176138c1 not found: ID does not exist" Apr 16 19:06:57.676081 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.676031 2566 scope.go:117] "RemoveContainer" containerID="5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8" Apr 16 19:06:57.676273 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:06:57.676255 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8\": container with ID starting with 5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8 not found: ID does not exist" containerID="5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8" Apr 16 19:06:57.676339 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.676278 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8"} err="failed to get container status \"5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8\": rpc error: code = NotFound desc = could not find container \"5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8\": container with ID starting with 5b286d0fbe60394a07014ed3d1f73144c6b01ae441968e63313834173b64f5a8 not found: ID does not exist" Apr 16 19:06:57.692067 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.692047 2566 kubelet.go:2553] "SyncLoop DELETE" 
source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz"] Apr 16 19:06:57.696972 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:57.696948 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-predictor-9bcff48d6-tmdbz"] Apr 16 19:06:58.452466 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:58.452425 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" path="/var/lib/kubelet/pods/d5328b66-3796-469a-b73a-e6040b0e3c3b/volumes" Apr 16 19:06:58.666058 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:58.666020 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" event={"ID":"7b469cb4-d206-4147-8b90-6bb6e89a22f8","Type":"ContainerStarted","Data":"3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c"} Apr 16 19:06:58.666418 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:58.666384 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" Apr 16 19:06:58.667646 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:58.667616 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:06:58.684092 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:58.684015 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podStartSLOduration=5.683976601 podStartE2EDuration="5.683976601s" podCreationTimestamp="2026-04-16 19:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:06:58.683869927 +0000 UTC m=+2994.776157894" watchObservedRunningTime="2026-04-16 19:06:58.683976601 +0000 UTC m=+2994.776264566" Apr 16 19:06:59.670455 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:06:59.670414 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:07:09.671216 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:07:09.671176 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:07:19.670717 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:07:19.670672 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:07:29.671445 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:07:29.671401 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:07:39.670780 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:07:39.670735 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:07:49.670814 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:07:49.670775 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:07:59.670610 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:07:59.670519 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused" Apr 16 19:08:07.449799 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:07.449764 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" Apr 16 19:08:13.376757 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.376727 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"] Apr 16 19:08:13.379160 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.377006 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" containerID="cri-o://3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c" gracePeriod=30 Apr 16 19:08:13.444672 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.444637 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"] Apr 16 19:08:13.445016 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.444987 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="storage-initializer" Apr 16 19:08:13.445016 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.445016 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="storage-initializer" Apr 16 19:08:13.445118 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.445033 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" Apr 16 19:08:13.445118 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.445039 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" Apr 16 19:08:13.445118 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.445102 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5328b66-3796-469a-b73a-e6040b0e3c3b" containerName="kserve-container" Apr 16 19:08:13.448023 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.448006 2566 util.go:30] "No sandbox for 
Apr 16 19:08:13.458165 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.458143 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"]
Apr 16 19:08:13.537796 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.537766 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c22612d-5afb-47bf-987d-45ab7e64ddfc-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-dffbv\" (UID: \"7c22612d-5afb-47bf-987d-45ab7e64ddfc\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"
Apr 16 19:08:13.639122 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.639034 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c22612d-5afb-47bf-987d-45ab7e64ddfc-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-dffbv\" (UID: \"7c22612d-5afb-47bf-987d-45ab7e64ddfc\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"
Apr 16 19:08:13.639484 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.639462 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c22612d-5afb-47bf-987d-45ab7e64ddfc-kserve-provision-location\") pod \"isvc-tensorflow-predictor-88f6f6cb7-dffbv\" (UID: \"7c22612d-5afb-47bf-987d-45ab7e64ddfc\") " pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"
Apr 16 19:08:13.758178 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.758141 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"
Apr 16 19:08:13.877365 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.877334 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"]
Apr 16 19:08:13.880972 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:08:13.880941 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c22612d_5afb_47bf_987d_45ab7e64ddfc.slice/crio-049f62ebe5d7d466c1fb9d663a904e80cc60f74ee7191b16e0770a12e445b3dd WatchSource:0}: Error finding container 049f62ebe5d7d466c1fb9d663a904e80cc60f74ee7191b16e0770a12e445b3dd: Status 404 returned error can't find the container with id 049f62ebe5d7d466c1fb9d663a904e80cc60f74ee7191b16e0770a12e445b3dd
Apr 16 19:08:13.894675 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:13.894615 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" event={"ID":"7c22612d-5afb-47bf-987d-45ab7e64ddfc","Type":"ContainerStarted","Data":"049f62ebe5d7d466c1fb9d663a904e80cc60f74ee7191b16e0770a12e445b3dd"}
Apr 16 19:08:14.899071 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:14.899030 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" event={"ID":"7c22612d-5afb-47bf-987d-45ab7e64ddfc","Type":"ContainerStarted","Data":"abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4"}
Apr 16 19:08:17.449074 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.449024 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.54:8080: connect: connection refused"
Apr 16 19:08:17.723060 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.723038 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"
Apr 16 19:08:17.878546 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.878502 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b469cb4-d206-4147-8b90-6bb6e89a22f8-kserve-provision-location\") pod \"7b469cb4-d206-4147-8b90-6bb6e89a22f8\" (UID: \"7b469cb4-d206-4147-8b90-6bb6e89a22f8\") "
Apr 16 19:08:17.878831 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.878807 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b469cb4-d206-4147-8b90-6bb6e89a22f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7b469cb4-d206-4147-8b90-6bb6e89a22f8" (UID: "7b469cb4-d206-4147-8b90-6bb6e89a22f8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:08:17.910595 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.910568 2566 generic.go:358] "Generic (PLEG): container finished" podID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerID="3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c" exitCode=0
Apr 16 19:08:17.910714 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.910680 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" event={"ID":"7b469cb4-d206-4147-8b90-6bb6e89a22f8","Type":"ContainerDied","Data":"3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c"}
Apr 16 19:08:17.910714 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.910701 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"
Apr 16 19:08:17.910785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.910716 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk" event={"ID":"7b469cb4-d206-4147-8b90-6bb6e89a22f8","Type":"ContainerDied","Data":"754d24fd492a7cb64f7c8d3555b07064b74abfad584e856f92b6f7b7abe3683b"}
Apr 16 19:08:17.910785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.910736 2566 scope.go:117] "RemoveContainer" containerID="3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c"
Apr 16 19:08:17.919805 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.919788 2566 scope.go:117] "RemoveContainer" containerID="54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1"
Apr 16 19:08:17.927362 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.927349 2566 scope.go:117] "RemoveContainer" containerID="3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c"
Apr 16 19:08:17.927591 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:08:17.927575 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c\": container with ID starting with 3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c not found: ID does not exist" containerID="3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c"
Apr 16 19:08:17.927635 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.927598 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c"} err="failed to get container status \"3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c\": rpc error: code = NotFound desc = could not find container \"3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c\": container with ID starting with 3fd7cb9fba7f40315ba4ca34fd828137687bdd8cebe68c3c6aa801792694736c not found: ID does not exist"
Apr 16 19:08:17.927635 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.927615 2566 scope.go:117] "RemoveContainer" containerID="54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1"
Apr 16 19:08:17.927811 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:08:17.927796 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1\": container with ID starting with 54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1 not found: ID does not exist" containerID="54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1"
Apr 16 19:08:17.927855 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.927815 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1"} err="failed to get container status \"54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1\": rpc error: code = NotFound desc = could not find container \"54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1\": container with ID starting with 54a211e6303f0fb17a341507f1d3d04d1b642ca50ffb16870f357de881bdd9d1 not found: ID does not exist"
Apr 16 19:08:17.937348 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.937328 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"]
Apr 16 19:08:17.942024 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.941985 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-v2-mixed-predictor-7b446468df-vvmqk"]
Apr 16 19:08:17.979712 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:17.979653 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7b469cb4-d206-4147-8b90-6bb6e89a22f8-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 19:08:18.453627 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:18.453589 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" path="/var/lib/kubelet/pods/7b469cb4-d206-4147-8b90-6bb6e89a22f8/volumes"
Apr 16 19:08:18.915794 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:18.915761 2566 generic.go:358] "Generic (PLEG): container finished" podID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerID="abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4" exitCode=0
Apr 16 19:08:18.915963 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:18.915816 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" event={"ID":"7c22612d-5afb-47bf-987d-45ab7e64ddfc","Type":"ContainerDied","Data":"abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4"}
Apr 16 19:08:18.916907 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:18.916890 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:08:22.935986 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:22.935952 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" event={"ID":"7c22612d-5afb-47bf-987d-45ab7e64ddfc","Type":"ContainerStarted","Data":"12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645"}
Apr 16 19:08:22.936386 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:22.936314 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"
Apr 16 19:08:22.937468 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:22.937442 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 16 19:08:22.959353 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:22.959310 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" podStartSLOduration=6.103614743 podStartE2EDuration="9.959299121s" podCreationTimestamp="2026-04-16 19:08:13 +0000 UTC" firstStartedPulling="2026-04-16 19:08:18.9170321 +0000 UTC m=+3075.009320043" lastFinishedPulling="2026-04-16 19:08:22.772716469 +0000 UTC m=+3078.865004421" observedRunningTime="2026-04-16 19:08:22.958540036 +0000 UTC m=+3079.050828073" watchObservedRunningTime="2026-04-16 19:08:22.959299121 +0000 UTC m=+3079.051587087"
Apr 16 19:08:23.939957 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:23.939919 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 16 19:08:33.940158 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:33.940116 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.55:8080: connect: connection refused"
Apr 16 19:08:43.941619 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:08:43.941586 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"
Apr 16 19:09:04.322635 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.319038 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"]
Apr 16 19:09:04.322635 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.319566 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="kserve-container" containerID="cri-o://12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645" gracePeriod=30
Apr 16 19:09:04.423078 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.423045 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"]
Apr 16 19:09:04.423452 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.423438 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="storage-initializer"
Apr 16 19:09:04.423501 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.423454 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="storage-initializer"
Apr 16 19:09:04.423501 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.423470 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container"
Apr 16 19:09:04.423501 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.423476 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container"
Apr 16 19:09:04.423593 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.423531 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="7b469cb4-d206-4147-8b90-6bb6e89a22f8" containerName="kserve-container"
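The startup-latency entry at 19:08:22.959353 above is the one pod in this section that actually pulled an image, and it shows how podStartSLOduration relates to the other fields: end-to-end duration minus the time spent pulling, measured on the monotonic clock (the m=+ offsets). A quick arithmetic check with values copied from that entry; this reproduces the logged figure but is not the tracker's actual code.

```go
// Worked check of the pod_startup_latency_tracker entry above: the SLO
// duration excludes image-pull time. All constants are copied from the log.
package main

import "fmt"

func main() {
	e2e := 9.959299121                  // podStartE2EDuration, seconds
	pullStart := 3075.009320043         // firstStartedPulling, m=+ offset
	pullEnd := 3078.865004421           // lastFinishedPulling, m=+ offset
	fmt.Printf("%.9f\n", e2e-(pullEnd-pullStart)) // 6.103614743 = podStartSLOduration
}
```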
Apr 16 19:09:04.426740 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.426721 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"
Apr 16 19:09:04.436373 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.436346 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"]
Apr 16 19:09:04.536386 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.536349 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bef2297c-35b4-4190-b7f5-b78b8d163c82-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g\" (UID: \"bef2297c-35b4-4190-b7f5-b78b8d163c82\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"
Apr 16 19:09:04.637860 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.637829 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bef2297c-35b4-4190-b7f5-b78b8d163c82-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g\" (UID: \"bef2297c-35b4-4190-b7f5-b78b8d163c82\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"
Apr 16 19:09:04.638227 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.638208 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bef2297c-35b4-4190-b7f5-b78b8d163c82-kserve-provision-location\") pod \"isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g\" (UID: \"bef2297c-35b4-4190-b7f5-b78b8d163c82\") " pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"
Apr 16 19:09:04.738627 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.738591 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"
Apr 16 19:09:04.866862 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:04.866831 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"]
Apr 16 19:09:04.868144 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:09:04.868107 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbef2297c_35b4_4190_b7f5_b78b8d163c82.slice/crio-4a6415f80857777113cf0be570a515e2dee44a8bb0d7e4b6677784785f32cb8a WatchSource:0}: Error finding container 4a6415f80857777113cf0be570a515e2dee44a8bb0d7e4b6677784785f32cb8a: Status 404 returned error can't find the container with id 4a6415f80857777113cf0be570a515e2dee44a8bb0d7e4b6677784785f32cb8a
Apr 16 19:09:05.068259 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:05.068217 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" event={"ID":"bef2297c-35b4-4190-b7f5-b78b8d163c82","Type":"ContainerStarted","Data":"283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e"}
Apr 16 19:09:05.068259 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:05.068255 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" event={"ID":"bef2297c-35b4-4190-b7f5-b78b8d163c82","Type":"ContainerStarted","Data":"4a6415f80857777113cf0be570a515e2dee44a8bb0d7e4b6677784785f32cb8a"}
Apr 16 19:09:14.099664 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:14.099632 2566 generic.go:358] "Generic (PLEG): container finished" podID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerID="283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e" exitCode=0
Apr 16 19:09:14.100095 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:14.099716 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" event={"ID":"bef2297c-35b4-4190-b7f5-b78b8d163c82","Type":"ContainerDied","Data":"283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e"}
Apr 16 19:09:15.104015 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:15.103965 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" event={"ID":"bef2297c-35b4-4190-b7f5-b78b8d163c82","Type":"ContainerStarted","Data":"bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b"}
Apr 16 19:09:15.104395 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:15.104260 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"
Apr 16 19:09:15.105608 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:15.105581 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused"
Apr 16 19:09:15.131443 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:15.131406 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" podStartSLOduration=11.131391049 podStartE2EDuration="11.131391049s" podCreationTimestamp="2026-04-16 19:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:09:15.130431872 +0000 UTC m=+3131.222719838" watchObservedRunningTime="2026-04-16 19:09:15.131391049 +0000 UTC m=+3131.223679014"
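Six predictors churn through this node in quick succession, so following any one of them through a journal this dense means filtering. A small convenience sketch that keys on the klog header (severity letter, mmdd hh:mm:ss.uuuuuu, pid, file:line) present in every kubenswrapper record and greps for a single pod; the regexp is an aid for reading, not an official parser. Feed it journalctl -u kubelet on stdin.

```go
// Sketch: filter kubelet journal lines for one pod and extract the klog
// severity, timestamp, and source location from each matching record.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
)

var klogHeader = regexp.MustCompile(`([IWE])(\d{4} \d{2}:\d{2}:\d{2}\.\d{6})\s+\d+\s+(\S+)\]`)

func main() {
	pod := "isvc-tensorflow-runtime-predictor" // the pod to follow; adjust as needed
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // journal lines can be very long
	for sc.Scan() {
		line := sc.Text()
		if !strings.Contains(line, pod) {
			continue
		}
		if m := klogHeader.FindStringSubmatch(line); m != nil {
			fmt.Printf("%s %s %s\n", m[1], m[2], m[3]) // severity, time, file:line
		}
	}
}
```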
19:09:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:09:15.130431872 +0000 UTC m=+3131.222719838" watchObservedRunningTime="2026-04-16 19:09:15.131391049 +0000 UTC m=+3131.223679014" Apr 16 19:09:16.107900 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:16.107856 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.56:8080: connect: connection refused" Apr 16 19:09:26.109228 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:26.109146 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" Apr 16 19:09:34.967394 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:34.967372 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" Apr 16 19:09:35.098034 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.097982 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c22612d-5afb-47bf-987d-45ab7e64ddfc-kserve-provision-location\") pod \"7c22612d-5afb-47bf-987d-45ab7e64ddfc\" (UID: \"7c22612d-5afb-47bf-987d-45ab7e64ddfc\") " Apr 16 19:09:35.108618 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.108589 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c22612d-5afb-47bf-987d-45ab7e64ddfc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7c22612d-5afb-47bf-987d-45ab7e64ddfc" (UID: "7c22612d-5afb-47bf-987d-45ab7e64ddfc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:09:35.166632 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.166603 2566 generic.go:358] "Generic (PLEG): container finished" podID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerID="12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645" exitCode=137 Apr 16 19:09:35.166765 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.166669 2566 util.go:48] "No ready sandbox for pod can be found. 
Apr 16 19:09:35.166765 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.166669 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"
Apr 16 19:09:35.166765 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.166681 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" event={"ID":"7c22612d-5afb-47bf-987d-45ab7e64ddfc","Type":"ContainerDied","Data":"12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645"}
Apr 16 19:09:35.166765 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.166718 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv" event={"ID":"7c22612d-5afb-47bf-987d-45ab7e64ddfc","Type":"ContainerDied","Data":"049f62ebe5d7d466c1fb9d663a904e80cc60f74ee7191b16e0770a12e445b3dd"}
Apr 16 19:09:35.166765 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.166734 2566 scope.go:117] "RemoveContainer" containerID="12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645"
Apr 16 19:09:35.174641 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.174624 2566 scope.go:117] "RemoveContainer" containerID="abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4"
Apr 16 19:09:35.181456 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.181440 2566 scope.go:117] "RemoveContainer" containerID="12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645"
Apr 16 19:09:35.181700 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:09:35.181683 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645\": container with ID starting with 12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645 not found: ID does not exist" containerID="12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645"
Apr 16 19:09:35.181739 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.181709 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645"} err="failed to get container status \"12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645\": rpc error: code = NotFound desc = could not find container \"12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645\": container with ID starting with 12ce3fd29e2627176a3b771078464aa21b74e2d1c45834d4317ec9f0ee986645 not found: ID does not exist"
Apr 16 19:09:35.181739 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.181727 2566 scope.go:117] "RemoveContainer" containerID="abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4"
Apr 16 19:09:35.181954 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:09:35.181939 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4\": container with ID starting with abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4 not found: ID does not exist" containerID="abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4"
Apr 16 19:09:35.182024 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.181967 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4"} err="failed to get container status \"abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4\": rpc error: code = NotFound desc = could not find container \"abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4\": container with ID starting with abad2d5db3066f9cba6442343de93fa982d4f8f138560d11b51f1df3c4a714e4 not found: ID does not exist"
Apr 16 19:09:35.189832 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.189811 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"]
Apr 16 19:09:35.194226 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.194205 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-predictor-88f6f6cb7-dffbv"]
Apr 16 19:09:35.198817 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:35.198796 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7c22612d-5afb-47bf-987d-45ab7e64ddfc-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 19:09:36.453069 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:36.453031 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" path="/var/lib/kubelet/pods/7c22612d-5afb-47bf-987d-45ab7e64ddfc/volumes"
Apr 16 19:09:45.317126 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.317089 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"]
Apr 16 19:09:45.317598 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.317362 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="kserve-container" containerID="cri-o://bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b" gracePeriod=30
Apr 16 19:09:45.370008 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.369964 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"]
Apr 16 19:09:45.370396 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.370378 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="storage-initializer"
Apr 16 19:09:45.370487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.370398 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="storage-initializer"
Apr 16 19:09:45.370487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.370413 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="kserve-container"
Apr 16 19:09:45.370487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.370421 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="kserve-container"
Apr 16 19:09:45.370655 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.370511 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="7c22612d-5afb-47bf-987d-45ab7e64ddfc" containerName="kserve-container"
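Note: the E-level NotFound errors above are not failures in the usual sense. Each one immediately follows a successful "RemoveContainer" for the same ID, i.e. the kubelet asks CRI-O for the status of a container it has just deleted, and the idempotent retry hits NotFound. A quick way to confirm every NotFound pairs with a prior removal, as a sketch (reads journal text from stdin, one entry per line; patterns taken from the entries above):

import re, sys

removed, orphan_errors = set(), []
for line in sys.stdin:
    if m := re.search(r'"RemoveContainer" containerID="([0-9a-f]{64})"', line):
        removed.add(m.group(1))
    elif m := re.search(r'could not find container \\"([0-9a-f]{64})\\"', line):
        if m.group(1) not in removed:
            orphan_errors.append(m.group(1))  # NotFound with no matching removal

print("NotFound without prior RemoveContainer:", orphan_errors or "none")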
Apr 16 19:09:45.372455 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.372436 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"
Apr 16 19:09:45.382040 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.381411 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a27a89-3dbd-4887-98e8-a30f0bbf9b39-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ql5ct\" (UID: \"58a27a89-3dbd-4887-98e8-a30f0bbf9b39\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"
Apr 16 19:09:45.383058 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.383037 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"]
Apr 16 19:09:45.482240 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.482211 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a27a89-3dbd-4887-98e8-a30f0bbf9b39-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ql5ct\" (UID: \"58a27a89-3dbd-4887-98e8-a30f0bbf9b39\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"
Apr 16 19:09:45.482537 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.482518 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a27a89-3dbd-4887-98e8-a30f0bbf9b39-kserve-provision-location\") pod \"isvc-triton-predictor-85f9f46646-ql5ct\" (UID: \"58a27a89-3dbd-4887-98e8-a30f0bbf9b39\") " pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"
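Note: the kserve-provision-location volume above is an emptyDir that walks a fixed state machine in this log: VerifyControllerAttachedVolume -> MountVolume started -> MountVolume.SetUp succeeded on the way up, and UnmountVolume started -> UnmountVolume.TearDown succeeded -> Volume detached on the way down (as seen for pod 7c22612d earlier). A sketch that extracts that sequence per pod UID from journal text on stdin, one entry per line (UID pattern as it appears in these entries):

import re, sys
from collections import defaultdict

STAGES = ["VerifyControllerAttachedVolume", "MountVolume started",
          "MountVolume.SetUp succeeded", "UnmountVolume started",
          "UnmountVolume.TearDown succeeded", "Volume detached"]
UID = re.compile(r"[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}")

timeline = defaultdict(list)
for line in sys.stdin:
    for stage in STAGES:
        if stage in line and (uid := UID.search(line)):
            timeline[uid.group(0)].append(stage)

for uid, stages in timeline.items():
    print(uid, "->", " | ".join(stages))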
Apr 16 19:09:45.682538 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.682507 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"
Apr 16 19:09:45.809423 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:45.809367 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"]
Apr 16 19:09:45.812192 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:09:45.812167 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58a27a89_3dbd_4887_98e8_a30f0bbf9b39.slice/crio-557b90be9403ebe707ec3de47e7b872b92c71b6d096056dfe03dece924749391 WatchSource:0}: Error finding container 557b90be9403ebe707ec3de47e7b872b92c71b6d096056dfe03dece924749391: Status 404 returned error can't find the container with id 557b90be9403ebe707ec3de47e7b872b92c71b6d096056dfe03dece924749391
Apr 16 19:09:46.199346 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:46.199314 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" event={"ID":"58a27a89-3dbd-4887-98e8-a30f0bbf9b39","Type":"ContainerStarted","Data":"6f2bf6b360fa29ed0b0736af2fa9096f38c07ad2f52ff4b41d357d9c3583f4b1"}
Apr 16 19:09:46.199346 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:46.199352 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" event={"ID":"58a27a89-3dbd-4887-98e8-a30f0bbf9b39","Type":"ContainerStarted","Data":"557b90be9403ebe707ec3de47e7b872b92c71b6d096056dfe03dece924749391"}
Apr 16 19:09:50.213706 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:50.213671 2566 generic.go:358] "Generic (PLEG): container finished" podID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerID="6f2bf6b360fa29ed0b0736af2fa9096f38c07ad2f52ff4b41d357d9c3583f4b1" exitCode=0
Apr 16 19:09:50.214192 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:09:50.213748 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" event={"ID":"58a27a89-3dbd-4887-98e8-a30f0bbf9b39","Type":"ContainerDied","Data":"6f2bf6b360fa29ed0b0736af2fa9096f38c07ad2f52ff4b41d357d9c3583f4b1"}
Apr 16 19:10:16.008094 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.008066 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"
Apr 16 19:10:16.169632 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.169534 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bef2297c-35b4-4190-b7f5-b78b8d163c82-kserve-provision-location\") pod \"bef2297c-35b4-4190-b7f5-b78b8d163c82\" (UID: \"bef2297c-35b4-4190-b7f5-b78b8d163c82\") "
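Note: the W-level manager.go:1169 "Status 404" warnings come from the cgroup watcher noticing a freshly created crio-<id> cgroup before the container is queryable in the runtime; in this log each flagged ID shows up in a ContainerStarted PLEG event moments later (557b90be... is flagged at 19:09:45.812 and started at 19:09:46.199), so they read as startup noise rather than failures. A sketch that cross-checks this over journal text on stdin:

import re, sys

watch_404, started = [], set()
for line in sys.stdin:
    if m := re.search(r"Error finding container ([0-9a-f]{64})", line):
        watch_404.append(m.group(1))
    elif m := re.search(r'"Type":"ContainerStarted","Data":"([0-9a-f]{64})"', line):
        started.add(m.group(1))

for cid in watch_404:
    print(cid[:12], "later started:", cid in started)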
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:10:16.270761 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.270720 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bef2297c-35b4-4190-b7f5-b78b8d163c82-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:10:16.338098 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.338045 2566 generic.go:358] "Generic (PLEG): container finished" podID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerID="bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b" exitCode=137 Apr 16 19:10:16.338274 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.338129 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" event={"ID":"bef2297c-35b4-4190-b7f5-b78b8d163c82","Type":"ContainerDied","Data":"bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b"} Apr 16 19:10:16.338274 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.338143 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" Apr 16 19:10:16.338274 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.338162 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g" event={"ID":"bef2297c-35b4-4190-b7f5-b78b8d163c82","Type":"ContainerDied","Data":"4a6415f80857777113cf0be570a515e2dee44a8bb0d7e4b6677784785f32cb8a"} Apr 16 19:10:16.338274 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.338180 2566 scope.go:117] "RemoveContainer" containerID="bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b" Apr 16 19:10:16.350760 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.350107 2566 scope.go:117] "RemoveContainer" containerID="283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e" Apr 16 19:10:16.361498 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.361482 2566 scope.go:117] "RemoveContainer" containerID="bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b" Apr 16 19:10:16.361965 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:10:16.361937 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b\": container with ID starting with bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b not found: ID does not exist" containerID="bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b" Apr 16 19:10:16.362089 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.362036 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b"} err="failed to get container status \"bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b\": rpc error: code = NotFound desc = could not find container \"bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b\": container with ID starting with bb6bd0539d2fb39f5af19fc39ccdd74463f0cf6594560113cf86f8d85c2dd29b not found: ID does not exist" Apr 16 19:10:16.362089 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.362062 2566 scope.go:117] "RemoveContainer" containerID="283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e" Apr 16 19:10:16.362438 
Apr 16 19:10:16.362438 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:10:16.362413 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e\": container with ID starting with 283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e not found: ID does not exist" containerID="283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e"
Apr 16 19:10:16.362541 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.362448 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e"} err="failed to get container status \"283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e\": rpc error: code = NotFound desc = could not find container \"283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e\": container with ID starting with 283b0eed1ec9073277437de5b0614c4573c370beca0be5ec932b0d1769c6432e not found: ID does not exist"
Apr 16 19:10:16.367539 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.367514 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"]
Apr 16 19:10:16.372485 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.372463 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-tensorflow-runtime-predictor-854bdff69c-f7c8g"]
Apr 16 19:10:16.454132 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:10:16.454050 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" path="/var/lib/kubelet/pods/bef2297c-35b4-4190-b7f5-b78b8d163c82/volumes"
Apr 16 19:11:45.637716 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:11:45.637679 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" event={"ID":"58a27a89-3dbd-4887-98e8-a30f0bbf9b39","Type":"ContainerStarted","Data":"245d9edcb0d4cdf01f89e1e1510cc836a12596c5fea8205872fc5a0988cfa3d0"}
Apr 16 19:11:45.638132 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:11:45.637901 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"
Apr 16 19:11:45.639139 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:11:45.639115 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused"
Apr 16 19:11:45.656523 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:11:45.656478 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" podStartSLOduration=5.780503402 podStartE2EDuration="2m0.656466982s" podCreationTimestamp="2026-04-16 19:09:45 +0000 UTC" firstStartedPulling="2026-04-16 19:09:50.214896014 +0000 UTC m=+3166.307183957" lastFinishedPulling="2026-04-16 19:11:45.09085959 +0000 UTC m=+3281.183147537" observedRunningTime="2026-04-16 19:11:45.65471233 +0000 UTC m=+3281.747000295" watchObservedRunningTime="2026-04-16 19:11:45.656466982 +0000 UTC m=+3281.748754949"
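Note: for isvc-triton-predictor the tracker reports a podStartE2EDuration of 2m0.66s but a podStartSLOduration of only 5.78 s. The difference is the image-pull window, which the SLO figure excludes; the monotonic m=+ offsets in the entry above let us reproduce the number exactly:

pull_start, pull_end = 3166.307183957, 3281.183147537  # firstStartedPulling, lastFinishedPulling (m=+ offsets)
e2e = 120.656466982                                    # podStartE2EDuration "2m0.656466982s"
slo = e2e - (pull_end - pull_start)                    # pull window is ~114.88 s
print(round(slo, 9))                                   # 5.780503402, matching podStartSLOduration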
pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.57:8080: connect: connection refused" Apr 16 19:11:56.641886 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:11:56.641849 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" Apr 16 19:12:06.884360 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.884325 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"] Apr 16 19:12:06.884805 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.884622 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="kserve-container" containerID="cri-o://245d9edcb0d4cdf01f89e1e1510cc836a12596c5fea8205872fc5a0988cfa3d0" gracePeriod=30 Apr 16 19:12:06.966520 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.966438 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"] Apr 16 19:12:06.966812 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.966798 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="kserve-container" Apr 16 19:12:06.966854 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.966815 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="kserve-container" Apr 16 19:12:06.966854 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.966833 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="storage-initializer" Apr 16 19:12:06.966854 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.966839 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="storage-initializer" Apr 16 19:12:06.966950 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.966897 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bef2297c-35b4-4190-b7f5-b78b8d163c82" containerName="kserve-container" Apr 16 19:12:06.972249 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.972228 2566 util.go:30] "No sandbox for pod can be found. 
Apr 16 19:12:06.972249 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.972228 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"
Apr 16 19:12:06.990520 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:06.990497 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"]
Apr 16 19:12:07.041819 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:07.041789 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/676ba6c0-8f8d-46a2-a780-3508a1520b10-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rdlk4\" (UID: \"676ba6c0-8f8d-46a2-a780-3508a1520b10\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"
Apr 16 19:12:07.142924 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:07.142888 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/676ba6c0-8f8d-46a2-a780-3508a1520b10-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rdlk4\" (UID: \"676ba6c0-8f8d-46a2-a780-3508a1520b10\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"
Apr 16 19:12:07.143292 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:07.143274 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/676ba6c0-8f8d-46a2-a780-3508a1520b10-kserve-provision-location\") pod \"isvc-xgboost-predictor-6dbc9d6d47-rdlk4\" (UID: \"676ba6c0-8f8d-46a2-a780-3508a1520b10\") " pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"
Apr 16 19:12:07.282155 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:07.282080 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"
Apr 16 19:12:07.443105 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:07.443082 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"]
Apr 16 19:12:07.445671 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:12:07.445644 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod676ba6c0_8f8d_46a2_a780_3508a1520b10.slice/crio-919a691269eaf2d3b88c73c8ea91d7742be7ceb2dd5cdd14104c5c0a2c04792e WatchSource:0}: Error finding container 919a691269eaf2d3b88c73c8ea91d7742be7ceb2dd5cdd14104c5c0a2c04792e: Status 404 returned error can't find the container with id 919a691269eaf2d3b88c73c8ea91d7742be7ceb2dd5cdd14104c5c0a2c04792e
Apr 16 19:12:07.704166 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:07.704135 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" event={"ID":"676ba6c0-8f8d-46a2-a780-3508a1520b10","Type":"ContainerStarted","Data":"410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd"}
Apr 16 19:12:07.704166 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:07.704172 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" event={"ID":"676ba6c0-8f8d-46a2-a780-3508a1520b10","Type":"ContainerStarted","Data":"919a691269eaf2d3b88c73c8ea91d7742be7ceb2dd5cdd14104c5c0a2c04792e"}
Apr 16 19:12:09.711985 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:09.711954 2566 generic.go:358] "Generic (PLEG): container finished" podID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerID="245d9edcb0d4cdf01f89e1e1510cc836a12596c5fea8205872fc5a0988cfa3d0" exitCode=0
Apr 16 19:12:09.712323 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:09.712023 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" event={"ID":"58a27a89-3dbd-4887-98e8-a30f0bbf9b39","Type":"ContainerDied","Data":"245d9edcb0d4cdf01f89e1e1510cc836a12596c5fea8205872fc5a0988cfa3d0"}
Apr 16 19:12:09.854681 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:09.854651 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"
Apr 16 19:12:09.966951 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:09.966867 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a27a89-3dbd-4887-98e8-a30f0bbf9b39-kserve-provision-location\") pod \"58a27a89-3dbd-4887-98e8-a30f0bbf9b39\" (UID: \"58a27a89-3dbd-4887-98e8-a30f0bbf9b39\") "
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:12:10.067982 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:10.067946 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58a27a89-3dbd-4887-98e8-a30f0bbf9b39-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:12:10.717216 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:10.717127 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" event={"ID":"58a27a89-3dbd-4887-98e8-a30f0bbf9b39","Type":"ContainerDied","Data":"557b90be9403ebe707ec3de47e7b872b92c71b6d096056dfe03dece924749391"} Apr 16 19:12:10.717216 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:10.717169 2566 scope.go:117] "RemoveContainer" containerID="245d9edcb0d4cdf01f89e1e1510cc836a12596c5fea8205872fc5a0988cfa3d0" Apr 16 19:12:10.717216 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:10.717169 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct" Apr 16 19:12:10.724983 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:10.724967 2566 scope.go:117] "RemoveContainer" containerID="6f2bf6b360fa29ed0b0736af2fa9096f38c07ad2f52ff4b41d357d9c3583f4b1" Apr 16 19:12:10.737258 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:10.737236 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"] Apr 16 19:12:10.744372 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:10.744347 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-triton-predictor-85f9f46646-ql5ct"] Apr 16 19:12:11.722869 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:11.722834 2566 generic.go:358] "Generic (PLEG): container finished" podID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerID="410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd" exitCode=0 Apr 16 19:12:11.723275 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:11.722914 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" event={"ID":"676ba6c0-8f8d-46a2-a780-3508a1520b10","Type":"ContainerDied","Data":"410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd"} Apr 16 19:12:12.453320 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:12.453288 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" path="/var/lib/kubelet/pods/58a27a89-3dbd-4887-98e8-a30f0bbf9b39/volumes" Apr 16 19:12:32.793524 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:32.793486 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" event={"ID":"676ba6c0-8f8d-46a2-a780-3508a1520b10","Type":"ContainerStarted","Data":"488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a"} Apr 16 19:12:32.794030 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:32.793825 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" Apr 16 19:12:32.794936 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:32.794911 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" probeResult="failure" 
output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 19:12:32.811648 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:32.811598 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podStartSLOduration=6.580248613 podStartE2EDuration="26.811586019s" podCreationTimestamp="2026-04-16 19:12:06 +0000 UTC" firstStartedPulling="2026-04-16 19:12:11.72411034 +0000 UTC m=+3307.816398283" lastFinishedPulling="2026-04-16 19:12:31.955447747 +0000 UTC m=+3328.047735689" observedRunningTime="2026-04-16 19:12:32.811098191 +0000 UTC m=+3328.903386183" watchObservedRunningTime="2026-04-16 19:12:32.811586019 +0000 UTC m=+3328.903873985" Apr 16 19:12:33.797202 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:33.797161 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 19:12:43.797680 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:43.797632 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 19:12:53.797519 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:12:53.797469 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 19:13:03.797883 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:03.797830 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 19:13:13.797878 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:13.797830 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 19:13:23.798059 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:23.797989 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.58:8080: connect: connection refused" Apr 16 19:13:33.798981 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:33.798949 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" Apr 16 19:13:37.076051 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.076009 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"] Apr 16 19:13:37.076470 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.076286 2566 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" containerID="cri-o://488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a" gracePeriod=30 Apr 16 19:13:37.196051 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.196013 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"] Apr 16 19:13:37.196388 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.196375 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="storage-initializer" Apr 16 19:13:37.196436 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.196390 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="storage-initializer" Apr 16 19:13:37.196436 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.196412 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="kserve-container" Apr 16 19:13:37.196436 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.196421 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="kserve-container" Apr 16 19:13:37.196532 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.196480 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="58a27a89-3dbd-4887-98e8-a30f0bbf9b39" containerName="kserve-container" Apr 16 19:13:37.199388 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.199370 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" Apr 16 19:13:37.205446 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.205415 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac8b4de0-388b-402c-9dd3-2375a4f4a512-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z\" (UID: \"ac8b4de0-388b-402c-9dd3-2375a4f4a512\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" Apr 16 19:13:37.207700 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.207679 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"] Apr 16 19:13:37.306098 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.306065 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac8b4de0-388b-402c-9dd3-2375a4f4a512-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z\" (UID: \"ac8b4de0-388b-402c-9dd3-2375a4f4a512\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" Apr 16 19:13:37.306443 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.306424 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac8b4de0-388b-402c-9dd3-2375a4f4a512-kserve-provision-location\") pod \"isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z\" (UID: \"ac8b4de0-388b-402c-9dd3-2375a4f4a512\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" Apr 16 19:13:37.510286 ip-10-0-136-226 
Apr 16 19:13:37.510286 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.510255 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"
Apr 16 19:13:37.636199 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.636175 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"]
Apr 16 19:13:37.638544 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:13:37.638511 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8b4de0_388b_402c_9dd3_2375a4f4a512.slice/crio-0d442e735fa533a55e77dae70b7d12125ece49b3e2e48a96efefc566a0e6ea7d WatchSource:0}: Error finding container 0d442e735fa533a55e77dae70b7d12125ece49b3e2e48a96efefc566a0e6ea7d: Status 404 returned error can't find the container with id 0d442e735fa533a55e77dae70b7d12125ece49b3e2e48a96efefc566a0e6ea7d
Apr 16 19:13:37.640439 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.640424 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:13:37.987600 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.987564 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" event={"ID":"ac8b4de0-388b-402c-9dd3-2375a4f4a512","Type":"ContainerStarted","Data":"043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07"}
Apr 16 19:13:37.987600 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:37.987603 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" event={"ID":"ac8b4de0-388b-402c-9dd3-2375a4f4a512","Type":"ContainerStarted","Data":"0d442e735fa533a55e77dae70b7d12125ece49b3e2e48a96efefc566a0e6ea7d"}
Apr 16 19:13:40.719766 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.719735 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"
Apr 16 19:13:40.729654 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.729632 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/676ba6c0-8f8d-46a2-a780-3508a1520b10-kserve-provision-location\") pod \"676ba6c0-8f8d-46a2-a780-3508a1520b10\" (UID: \"676ba6c0-8f8d-46a2-a780-3508a1520b10\") "
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:13:40.830197 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.830131 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/676ba6c0-8f8d-46a2-a780-3508a1520b10-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:13:40.998016 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.997976 2566 generic.go:358] "Generic (PLEG): container finished" podID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerID="488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a" exitCode=0 Apr 16 19:13:40.998151 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.998041 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" event={"ID":"676ba6c0-8f8d-46a2-a780-3508a1520b10","Type":"ContainerDied","Data":"488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a"} Apr 16 19:13:40.998151 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.998058 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" Apr 16 19:13:40.998151 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.998069 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4" event={"ID":"676ba6c0-8f8d-46a2-a780-3508a1520b10","Type":"ContainerDied","Data":"919a691269eaf2d3b88c73c8ea91d7742be7ceb2dd5cdd14104c5c0a2c04792e"} Apr 16 19:13:40.998151 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:40.998084 2566 scope.go:117] "RemoveContainer" containerID="488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a" Apr 16 19:13:41.006177 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:41.006159 2566 scope.go:117] "RemoveContainer" containerID="410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd" Apr 16 19:13:41.013508 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:41.013488 2566 scope.go:117] "RemoveContainer" containerID="488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a" Apr 16 19:13:41.013796 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:13:41.013776 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a\": container with ID starting with 488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a not found: ID does not exist" containerID="488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a" Apr 16 19:13:41.013846 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:41.013807 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a"} err="failed to get container status \"488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a\": rpc error: code = NotFound desc = could not find container \"488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a\": container with ID starting with 488dc6d1c03689eb0ea441dd98f40c4dd454c0bd35d1b87ee68edb84178d7f2a not found: ID does not exist" Apr 16 19:13:41.013846 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:41.013825 2566 scope.go:117] "RemoveContainer" containerID="410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd" Apr 16 19:13:41.014090 ip-10-0-136-226 kubenswrapper[2566]: E0416 
Apr 16 19:13:41.014090 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:13:41.014068 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd\": container with ID starting with 410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd not found: ID does not exist" containerID="410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd"
Apr 16 19:13:41.014143 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:41.014100 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd"} err="failed to get container status \"410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd\": rpc error: code = NotFound desc = could not find container \"410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd\": container with ID starting with 410a979d3ba92c5608458bc1097bea034b2ddb907e30f1f79bdda0791b7376fd not found: ID does not exist"
Apr 16 19:13:41.021282 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:41.021258 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"]
Apr 16 19:13:41.026234 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:41.026213 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-predictor-6dbc9d6d47-rdlk4"]
Apr 16 19:13:42.002793 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:42.002763 2566 generic.go:358] "Generic (PLEG): container finished" podID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerID="043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07" exitCode=0
Apr 16 19:13:42.003240 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:42.002847 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" event={"ID":"ac8b4de0-388b-402c-9dd3-2375a4f4a512","Type":"ContainerDied","Data":"043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07"}
Apr 16 19:13:42.453024 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:42.452969 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" path="/var/lib/kubelet/pods/676ba6c0-8f8d-46a2-a780-3508a1520b10/volumes"
Apr 16 19:13:43.008529 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:43.008498 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" event={"ID":"ac8b4de0-388b-402c-9dd3-2375a4f4a512","Type":"ContainerStarted","Data":"4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4"}
Apr 16 19:13:43.008949 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:13:43.008711 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"
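Note: the PLEG lines alone are enough to rebuild each pod's container timeline: ContainerStarted/ContainerDied events carry the pod name and the container (or sandbox) ID, and the generic.go:358 "container finished" lines add the exit code. A sketch that reconstructs per-pod timelines from journal text on stdin, one entry per line (regex written against the field layout above):

import re, sys
from collections import defaultdict

event = re.compile(r'^Apr 16 (\S+) .*"SyncLoop \(PLEG\): event for pod" pod="([^"]+)" '
                   r'event=.*"Type":"(Container\w+)","Data":"([0-9a-f]{64})"')
history = defaultdict(list)
for line in sys.stdin:
    if m := event.search(line):
        ts, pod, kind, cid = m.groups()
        history[pod].append((ts, kind, cid[:12]))  # short ID for readability

for pod, events in history.items():
    print(pod)
    for ts, kind, cid in events:
        print(f"  {ts} {kind} {cid}")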
m=+3399.121179921" Apr 16 19:14:14.054492 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:14.054458 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" Apr 16 19:14:17.497037 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.496984 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw"] Apr 16 19:14:17.497506 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.497486 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" Apr 16 19:14:17.497582 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.497510 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" Apr 16 19:14:17.497582 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.497541 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="storage-initializer" Apr 16 19:14:17.497582 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.497550 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="storage-initializer" Apr 16 19:14:17.497742 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.497646 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="676ba6c0-8f8d-46a2-a780-3508a1520b10" containerName="kserve-container" Apr 16 19:14:17.500554 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.500534 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:14:17.512455 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.512433 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw"] Apr 16 19:14:17.572671 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.572635 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"] Apr 16 19:14:17.572925 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.572886 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerName="kserve-container" containerID="cri-o://4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4" gracePeriod=30 Apr 16 19:14:17.629922 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.629894 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6be958c5-27f5-49ae-8410-f89e62b75f3f-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-4dnsw\" (UID: \"6be958c5-27f5-49ae-8410-f89e62b75f3f\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:14:17.731347 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.731301 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6be958c5-27f5-49ae-8410-f89e62b75f3f-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-4dnsw\" (UID: \"6be958c5-27f5-49ae-8410-f89e62b75f3f\") " 
pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:14:17.731634 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.731619 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6be958c5-27f5-49ae-8410-f89e62b75f3f-kserve-provision-location\") pod \"xgboost-v2-mlserver-predictor-c58d48f-4dnsw\" (UID: \"6be958c5-27f5-49ae-8410-f89e62b75f3f\") " pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:14:17.811088 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.811011 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:14:17.984704 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:17.984673 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw"] Apr 16 19:14:17.987783 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:14:17.987756 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be958c5_27f5_49ae_8410_f89e62b75f3f.slice/crio-92be62814d2941a1ea4f0056caed536096d77c79dfbcc8e792da49d2fec4d067 WatchSource:0}: Error finding container 92be62814d2941a1ea4f0056caed536096d77c79dfbcc8e792da49d2fec4d067: Status 404 returned error can't find the container with id 92be62814d2941a1ea4f0056caed536096d77c79dfbcc8e792da49d2fec4d067 Apr 16 19:14:18.112915 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:18.112878 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" event={"ID":"6be958c5-27f5-49ae-8410-f89e62b75f3f","Type":"ContainerStarted","Data":"7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485"} Apr 16 19:14:18.113098 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:18.112922 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" event={"ID":"6be958c5-27f5-49ae-8410-f89e62b75f3f","Type":"ContainerStarted","Data":"92be62814d2941a1ea4f0056caed536096d77c79dfbcc8e792da49d2fec4d067"} Apr 16 19:14:22.130432 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:22.130396 2566 generic.go:358] "Generic (PLEG): container finished" podID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerID="7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485" exitCode=0 Apr 16 19:14:22.130942 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:22.130469 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" event={"ID":"6be958c5-27f5-49ae-8410-f89e62b75f3f","Type":"ContainerDied","Data":"7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485"} Apr 16 19:14:23.134701 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:23.134664 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" event={"ID":"6be958c5-27f5-49ae-8410-f89e62b75f3f","Type":"ContainerStarted","Data":"74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064"} Apr 16 19:14:23.135095 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:23.134885 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:14:23.156443 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:23.156391 2566 
Apr 16 19:14:23.156443 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:23.156391 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" podStartSLOduration=6.156370714 podStartE2EDuration="6.156370714s" podCreationTimestamp="2026-04-16 19:14:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:14:23.155508982 +0000 UTC m=+3439.247796948" watchObservedRunningTime="2026-04-16 19:14:23.156370714 +0000 UTC m=+3439.248658680"
Apr 16 19:14:24.013154 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:24.013108 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.59:8080/v2/models/isvc-xgboost-v2-mlserver/ready\": dial tcp 10.132.0.59:8080: connect: connection refused"
Apr 16 19:14:24.621927 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:24.621900 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"
Apr 16 19:14:24.792503 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:24.792421 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac8b4de0-388b-402c-9dd3-2375a4f4a512-kserve-provision-location\") pod \"ac8b4de0-388b-402c-9dd3-2375a4f4a512\" (UID: \"ac8b4de0-388b-402c-9dd3-2375a4f4a512\") "
Apr 16 19:14:24.792742 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:24.792717 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac8b4de0-388b-402c-9dd3-2375a4f4a512-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ac8b4de0-388b-402c-9dd3-2375a4f4a512" (UID: "ac8b4de0-388b-402c-9dd3-2375a4f4a512"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:14:24.893868 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:24.893838 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ac8b4de0-388b-402c-9dd3-2375a4f4a512-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\""
Apr 16 19:14:25.142203 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.142167 2566 generic.go:358] "Generic (PLEG): container finished" podID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerID="4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4" exitCode=0
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" Apr 16 19:14:25.142374 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.142257 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" event={"ID":"ac8b4de0-388b-402c-9dd3-2375a4f4a512","Type":"ContainerDied","Data":"4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4"} Apr 16 19:14:25.142374 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.142292 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z" event={"ID":"ac8b4de0-388b-402c-9dd3-2375a4f4a512","Type":"ContainerDied","Data":"0d442e735fa533a55e77dae70b7d12125ece49b3e2e48a96efefc566a0e6ea7d"} Apr 16 19:14:25.142374 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.142309 2566 scope.go:117] "RemoveContainer" containerID="4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4" Apr 16 19:14:25.150619 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.150602 2566 scope.go:117] "RemoveContainer" containerID="043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07" Apr 16 19:14:25.158297 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.158282 2566 scope.go:117] "RemoveContainer" containerID="4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4" Apr 16 19:14:25.158543 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:14:25.158525 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4\": container with ID starting with 4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4 not found: ID does not exist" containerID="4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4" Apr 16 19:14:25.158627 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.158554 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4"} err="failed to get container status \"4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4\": rpc error: code = NotFound desc = could not find container \"4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4\": container with ID starting with 4b8b45d1b0b02c14669f7336895dcbc85651de6aa41f2222ffbee31a977c56b4 not found: ID does not exist" Apr 16 19:14:25.158627 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.158578 2566 scope.go:117] "RemoveContainer" containerID="043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07" Apr 16 19:14:25.158800 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:14:25.158784 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07\": container with ID starting with 043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07 not found: ID does not exist" containerID="043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07" Apr 16 19:14:25.158851 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.158807 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07"} err="failed to get container status 
\"043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07\": rpc error: code = NotFound desc = could not find container \"043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07\": container with ID starting with 043b961f8a3631955be402b26e88a60dedd950b01adeff7f89706450c9c9ec07 not found: ID does not exist" Apr 16 19:14:25.169584 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.169562 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"] Apr 16 19:14:25.174056 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:25.174032 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-mlserver-predictor-58b7db6668-wrq2z"] Apr 16 19:14:26.451981 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:26.451946 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" path="/var/lib/kubelet/pods/ac8b4de0-388b-402c-9dd3-2375a4f4a512/volumes" Apr 16 19:14:54.154029 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:54.153984 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:14:57.715779 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.715740 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw"] Apr 16 19:14:57.716173 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.716004 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerName="kserve-container" containerID="cri-o://74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064" gracePeriod=30 Apr 16 19:14:57.745904 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.745874 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk"] Apr 16 19:14:57.746293 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.746278 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerName="kserve-container" Apr 16 19:14:57.746339 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.746295 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerName="kserve-container" Apr 16 19:14:57.746339 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.746304 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerName="storage-initializer" Apr 16 19:14:57.746339 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.746310 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerName="storage-initializer" Apr 16 19:14:57.746434 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.746361 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ac8b4de0-388b-402c-9dd3-2375a4f4a512" containerName="kserve-container" Apr 16 19:14:57.749475 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.749458 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:14:57.761155 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.761123 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk"] Apr 16 19:14:57.867925 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.867881 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6345602d-43e5-4f47-ad01-30f26e694d09-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-dzntk\" (UID: \"6345602d-43e5-4f47-ad01-30f26e694d09\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:14:57.969116 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.969023 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6345602d-43e5-4f47-ad01-30f26e694d09-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-dzntk\" (UID: \"6345602d-43e5-4f47-ad01-30f26e694d09\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:14:57.969403 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:57.969383 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6345602d-43e5-4f47-ad01-30f26e694d09-kserve-provision-location\") pod \"isvc-xgboost-runtime-predictor-687c7765c9-dzntk\" (UID: \"6345602d-43e5-4f47-ad01-30f26e694d09\") " pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:14:58.059980 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:58.059946 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:14:58.178649 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:58.178610 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk"] Apr 16 19:14:58.180887 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:14:58.180858 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6345602d_43e5_4f47_ad01_30f26e694d09.slice/crio-1f602b63d55f03425fecd44917f3801be65e8a3e0b29a6d6a6d9404803c28864 WatchSource:0}: Error finding container 1f602b63d55f03425fecd44917f3801be65e8a3e0b29a6d6a6d9404803c28864: Status 404 returned error can't find the container with id 1f602b63d55f03425fecd44917f3801be65e8a3e0b29a6d6a6d9404803c28864 Apr 16 19:14:58.244551 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:58.244523 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" event={"ID":"6345602d-43e5-4f47-ad01-30f26e694d09","Type":"ContainerStarted","Data":"de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49"} Apr 16 19:14:58.244672 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:14:58.244558 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" event={"ID":"6345602d-43e5-4f47-ad01-30f26e694d09","Type":"ContainerStarted","Data":"1f602b63d55f03425fecd44917f3801be65e8a3e0b29a6d6a6d9404803c28864"} Apr 16 19:15:02.263936 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:02.263847 2566 generic.go:358] "Generic (PLEG): container finished" podID="6345602d-43e5-4f47-ad01-30f26e694d09" containerID="de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49" exitCode=0 Apr 16 19:15:02.263936 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:02.263918 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" event={"ID":"6345602d-43e5-4f47-ad01-30f26e694d09","Type":"ContainerDied","Data":"de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49"} Apr 16 19:15:03.269571 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:03.269535 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" event={"ID":"6345602d-43e5-4f47-ad01-30f26e694d09","Type":"ContainerStarted","Data":"8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529"} Apr 16 19:15:03.269963 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:03.269837 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:15:03.271181 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:03.271154 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 16 19:15:03.292330 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:03.292290 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podStartSLOduration=6.292278483 podStartE2EDuration="6.292278483s" podCreationTimestamp="2026-04-16 19:14:57 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:15:03.29106559 +0000 UTC m=+3479.383353555" watchObservedRunningTime="2026-04-16 19:15:03.292278483 +0000 UTC m=+3479.384566448" Apr 16 19:15:04.139197 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:04.139156 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.60:8080/v2/models/xgboost-v2-mlserver/ready\": dial tcp 10.132.0.60:8080: connect: connection refused" Apr 16 19:15:04.273968 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:04.273929 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 16 19:15:05.583836 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:05.583815 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:15:05.735664 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:05.735577 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6be958c5-27f5-49ae-8410-f89e62b75f3f-kserve-provision-location\") pod \"6be958c5-27f5-49ae-8410-f89e62b75f3f\" (UID: \"6be958c5-27f5-49ae-8410-f89e62b75f3f\") " Apr 16 19:15:05.735901 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:05.735877 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6be958c5-27f5-49ae-8410-f89e62b75f3f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6be958c5-27f5-49ae-8410-f89e62b75f3f" (UID: "6be958c5-27f5-49ae-8410-f89e62b75f3f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:15:05.836462 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:05.836434 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6be958c5-27f5-49ae-8410-f89e62b75f3f-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:15:06.280260 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.280223 2566 generic.go:358] "Generic (PLEG): container finished" podID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerID="74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064" exitCode=0 Apr 16 19:15:06.280429 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.280283 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" Apr 16 19:15:06.280429 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.280307 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" event={"ID":"6be958c5-27f5-49ae-8410-f89e62b75f3f","Type":"ContainerDied","Data":"74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064"} Apr 16 19:15:06.280429 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.280345 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw" event={"ID":"6be958c5-27f5-49ae-8410-f89e62b75f3f","Type":"ContainerDied","Data":"92be62814d2941a1ea4f0056caed536096d77c79dfbcc8e792da49d2fec4d067"} Apr 16 19:15:06.280429 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.280360 2566 scope.go:117] "RemoveContainer" containerID="74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064" Apr 16 19:15:06.288488 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.288468 2566 scope.go:117] "RemoveContainer" containerID="7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485" Apr 16 19:15:06.295598 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.295583 2566 scope.go:117] "RemoveContainer" containerID="74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064" Apr 16 19:15:06.295827 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:15:06.295809 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064\": container with ID starting with 74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064 not found: ID does not exist" containerID="74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064" Apr 16 19:15:06.295878 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.295835 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064"} err="failed to get container status \"74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064\": rpc error: code = NotFound desc = could not find container \"74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064\": container with ID starting with 74b0ae066f8b81112276bcaff8eabf0686233e4e6dae48f77c6aa1d5b6a9a064 not found: ID does not exist" Apr 16 19:15:06.295878 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.295851 2566 scope.go:117] "RemoveContainer" containerID="7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485" Apr 16 19:15:06.296074 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:15:06.296055 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485\": container with ID starting with 7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485 not found: ID does not exist" containerID="7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485" Apr 16 19:15:06.296134 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.296084 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485"} err="failed to get container status \"7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485\": rpc error: code = 
NotFound desc = could not find container \"7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485\": container with ID starting with 7c9ff72de00afab0d29c9d0a68cca5c05fc6b55dcaaeef53938f8f285a895485 not found: ID does not exist" Apr 16 19:15:06.305730 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.305700 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw"] Apr 16 19:15:06.307256 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.307234 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/xgboost-v2-mlserver-predictor-c58d48f-4dnsw"] Apr 16 19:15:06.452618 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:06.452575 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" path="/var/lib/kubelet/pods/6be958c5-27f5-49ae-8410-f89e62b75f3f/volumes" Apr 16 19:15:14.274452 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:14.274413 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 16 19:15:24.274446 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:24.274364 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 16 19:15:34.274148 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:34.274104 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 16 19:15:44.274418 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:44.274378 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 16 19:15:54.274808 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:15:54.274758 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.61:8080: connect: connection refused" Apr 16 19:16:04.275178 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:04.275151 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:16:07.822889 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.822854 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk"] Apr 16 19:16:07.823278 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.823119 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" 
containerID="cri-o://8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529" gracePeriod=30 Apr 16 19:16:07.913216 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.913184 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj"] Apr 16 19:16:07.913525 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.913513 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerName="kserve-container" Apr 16 19:16:07.913575 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.913527 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerName="kserve-container" Apr 16 19:16:07.913575 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.913537 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerName="storage-initializer" Apr 16 19:16:07.913575 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.913543 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerName="storage-initializer" Apr 16 19:16:07.913685 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.913607 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6be958c5-27f5-49ae-8410-f89e62b75f3f" containerName="kserve-container" Apr 16 19:16:07.918100 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.918082 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:16:07.926237 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:07.926216 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj"] Apr 16 19:16:08.059201 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:08.059162 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f3cc11-71eb-4027-be32-e08da9781cf1-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj\" (UID: \"25f3cc11-71eb-4027-be32-e08da9781cf1\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:16:08.160125 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:08.160090 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f3cc11-71eb-4027-be32-e08da9781cf1-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj\" (UID: \"25f3cc11-71eb-4027-be32-e08da9781cf1\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:16:08.160569 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:08.160543 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f3cc11-71eb-4027-be32-e08da9781cf1-kserve-provision-location\") pod \"isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj\" (UID: \"25f3cc11-71eb-4027-be32-e08da9781cf1\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:16:08.228316 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:08.228285 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:16:08.353401 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:08.348899 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj"] Apr 16 19:16:08.355100 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:16:08.355062 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25f3cc11_71eb_4027_be32_e08da9781cf1.slice/crio-04ac92194588fde18ee82f4d80da59836810c32accb37556c156aeef24f2d9b3 WatchSource:0}: Error finding container 04ac92194588fde18ee82f4d80da59836810c32accb37556c156aeef24f2d9b3: Status 404 returned error can't find the container with id 04ac92194588fde18ee82f4d80da59836810c32accb37556c156aeef24f2d9b3 Apr 16 19:16:08.471152 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:08.471071 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" event={"ID":"25f3cc11-71eb-4027-be32-e08da9781cf1","Type":"ContainerStarted","Data":"6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8"} Apr 16 19:16:08.471152 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:08.471115 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" event={"ID":"25f3cc11-71eb-4027-be32-e08da9781cf1","Type":"ContainerStarted","Data":"04ac92194588fde18ee82f4d80da59836810c32accb37556c156aeef24f2d9b3"} Apr 16 19:16:11.366444 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.366418 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:16:11.481705 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.481616 2566 generic.go:358] "Generic (PLEG): container finished" podID="6345602d-43e5-4f47-ad01-30f26e694d09" containerID="8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529" exitCode=0 Apr 16 19:16:11.481705 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.481678 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" Apr 16 19:16:11.481705 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.481696 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" event={"ID":"6345602d-43e5-4f47-ad01-30f26e694d09","Type":"ContainerDied","Data":"8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529"} Apr 16 19:16:11.481916 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.481732 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk" event={"ID":"6345602d-43e5-4f47-ad01-30f26e694d09","Type":"ContainerDied","Data":"1f602b63d55f03425fecd44917f3801be65e8a3e0b29a6d6a6d9404803c28864"} Apr 16 19:16:11.481916 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.481745 2566 scope.go:117] "RemoveContainer" containerID="8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529" Apr 16 19:16:11.487035 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.487015 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6345602d-43e5-4f47-ad01-30f26e694d09-kserve-provision-location\") pod \"6345602d-43e5-4f47-ad01-30f26e694d09\" (UID: \"6345602d-43e5-4f47-ad01-30f26e694d09\") " Apr 16 19:16:11.487385 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.487360 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6345602d-43e5-4f47-ad01-30f26e694d09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6345602d-43e5-4f47-ad01-30f26e694d09" (UID: "6345602d-43e5-4f47-ad01-30f26e694d09"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:16:11.489494 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.489478 2566 scope.go:117] "RemoveContainer" containerID="de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49" Apr 16 19:16:11.496279 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.496264 2566 scope.go:117] "RemoveContainer" containerID="8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529" Apr 16 19:16:11.496530 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:16:11.496511 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529\": container with ID starting with 8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529 not found: ID does not exist" containerID="8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529" Apr 16 19:16:11.496580 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.496540 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529"} err="failed to get container status \"8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529\": rpc error: code = NotFound desc = could not find container \"8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529\": container with ID starting with 8880fecc4b307b72ce8300ae9f2f2ee0408f01cc22835b6395bc2e3ec3a63529 not found: ID does not exist" Apr 16 19:16:11.496580 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.496557 2566 scope.go:117] "RemoveContainer" containerID="de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49" Apr 16 19:16:11.496782 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:16:11.496764 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49\": container with ID starting with de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49 not found: ID does not exist" containerID="de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49" Apr 16 19:16:11.496831 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.496789 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49"} err="failed to get container status \"de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49\": rpc error: code = NotFound desc = could not find container \"de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49\": container with ID starting with de08e343b61944838d8f1e8c3708f16fa5238ccdcdb28c32a1876569e4c82c49 not found: ID does not exist" Apr 16 19:16:11.588237 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.588212 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6345602d-43e5-4f47-ad01-30f26e694d09-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:16:11.806075 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.806048 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk"] Apr 16 19:16:11.811913 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:11.811886 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-runtime-predictor-687c7765c9-dzntk"] Apr 16 19:16:12.452926 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:12.452841 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" path="/var/lib/kubelet/pods/6345602d-43e5-4f47-ad01-30f26e694d09/volumes" Apr 16 19:16:12.486359 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:12.486332 2566 generic.go:358] "Generic (PLEG): container finished" podID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerID="6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8" exitCode=0 Apr 16 19:16:12.486502 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:12.486400 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" event={"ID":"25f3cc11-71eb-4027-be32-e08da9781cf1","Type":"ContainerDied","Data":"6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8"} Apr 16 19:16:13.490572 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:13.490540 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" event={"ID":"25f3cc11-71eb-4027-be32-e08da9781cf1","Type":"ContainerStarted","Data":"3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3"} Apr 16 19:16:13.490955 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:13.490740 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:16:13.510180 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:13.510131 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" podStartSLOduration=6.510118973 podStartE2EDuration="6.510118973s" podCreationTimestamp="2026-04-16 19:16:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:16:13.507837104 +0000 UTC m=+3549.600125069" watchObservedRunningTime="2026-04-16 19:16:13.510118973 +0000 UTC m=+3549.602406938" Apr 16 19:16:44.554360 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:44.554309 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="kserve-container" probeResult="failure" output="HTTP probe failed with statuscode: 400" Apr 16 19:16:54.496253 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:54.496177 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:16:57.979892 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:57.979858 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj"] Apr 16 19:16:57.980354 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:57.980219 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="kserve-container" containerID="cri-o://3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3" gracePeriod=30 Apr 16 19:16:58.031812 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.031776 2566 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns"] Apr 16 19:16:58.032234 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.032217 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="storage-initializer" Apr 16 19:16:58.032336 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.032236 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="storage-initializer" Apr 16 19:16:58.032336 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.032260 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" Apr 16 19:16:58.032336 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.032269 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" Apr 16 19:16:58.032504 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.032348 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="6345602d-43e5-4f47-ad01-30f26e694d09" containerName="kserve-container" Apr 16 19:16:58.035484 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.035464 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:16:58.042725 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.042682 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns"] Apr 16 19:16:58.169191 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.169155 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ef96a2b-4675-45ef-b733-4052094a41aa-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-gg5ns\" (UID: \"3ef96a2b-4675-45ef-b733-4052094a41aa\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:16:58.270629 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.270543 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ef96a2b-4675-45ef-b733-4052094a41aa-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-gg5ns\" (UID: \"3ef96a2b-4675-45ef-b733-4052094a41aa\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:16:58.271018 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.270975 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ef96a2b-4675-45ef-b733-4052094a41aa-kserve-provision-location\") pod \"isvc-xgboost-v2-predictor-5db5686f9f-gg5ns\" (UID: \"3ef96a2b-4675-45ef-b733-4052094a41aa\") " pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:16:58.346809 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.346777 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:16:58.468908 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.468875 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns"] Apr 16 19:16:58.472852 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:16:58.472816 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ef96a2b_4675_45ef_b733_4052094a41aa.slice/crio-7722c483a0de412b50d001f262ffd1a8cf0a1243ab3d68974dc5d31c52f266d2 WatchSource:0}: Error finding container 7722c483a0de412b50d001f262ffd1a8cf0a1243ab3d68974dc5d31c52f266d2: Status 404 returned error can't find the container with id 7722c483a0de412b50d001f262ffd1a8cf0a1243ab3d68974dc5d31c52f266d2 Apr 16 19:16:58.622563 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.622529 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" event={"ID":"3ef96a2b-4675-45ef-b733-4052094a41aa","Type":"ContainerStarted","Data":"389a51dbebf34e6da514f7eec49997448d90aac84f1f1df3007fa66071f5c3d2"} Apr 16 19:16:58.622707 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:16:58.622569 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" event={"ID":"3ef96a2b-4675-45ef-b733-4052094a41aa","Type":"ContainerStarted","Data":"7722c483a0de412b50d001f262ffd1a8cf0a1243ab3d68974dc5d31c52f266d2"} Apr 16 19:17:02.638590 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:02.638550 2566 generic.go:358] "Generic (PLEG): container finished" podID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerID="389a51dbebf34e6da514f7eec49997448d90aac84f1f1df3007fa66071f5c3d2" exitCode=0 Apr 16 19:17:02.639072 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:02.638619 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" event={"ID":"3ef96a2b-4675-45ef-b733-4052094a41aa","Type":"ContainerDied","Data":"389a51dbebf34e6da514f7eec49997448d90aac84f1f1df3007fa66071f5c3d2"} Apr 16 19:17:03.643751 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:03.643716 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" event={"ID":"3ef96a2b-4675-45ef-b733-4052094a41aa","Type":"ContainerStarted","Data":"ad8a2f48c178872efc97ec022e64d72b4122f48adc1334051ef4a55072a08a8e"} Apr 16 19:17:03.644198 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:03.644022 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:17:03.645423 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:03.645395 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 16 19:17:03.662579 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:03.662535 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podStartSLOduration=5.662521434 podStartE2EDuration="5.662521434s" podCreationTimestamp="2026-04-16 19:16:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:17:03.66056995 +0000 UTC m=+3599.752857915" watchObservedRunningTime="2026-04-16 19:17:03.662521434 +0000 UTC m=+3599.754809377" Apr 16 19:17:04.494351 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:04.494306 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="kserve-container" probeResult="failure" output="Get \"http://10.132.0.62:8080/v2/models/isvc-xgboost-v2-runtime/ready\": dial tcp 10.132.0.62:8080: connect: connection refused" Apr 16 19:17:04.647309 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:04.647269 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 16 19:17:05.551888 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.551865 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:17:05.651425 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.651344 2566 generic.go:358] "Generic (PLEG): container finished" podID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerID="3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3" exitCode=0 Apr 16 19:17:05.651425 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.651418 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" Apr 16 19:17:05.651835 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.651424 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" event={"ID":"25f3cc11-71eb-4027-be32-e08da9781cf1","Type":"ContainerDied","Data":"3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3"} Apr 16 19:17:05.651835 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.651517 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj" event={"ID":"25f3cc11-71eb-4027-be32-e08da9781cf1","Type":"ContainerDied","Data":"04ac92194588fde18ee82f4d80da59836810c32accb37556c156aeef24f2d9b3"} Apr 16 19:17:05.651835 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.651533 2566 scope.go:117] "RemoveContainer" containerID="3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3" Apr 16 19:17:05.659223 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.659208 2566 scope.go:117] "RemoveContainer" containerID="6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8" Apr 16 19:17:05.665841 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.665825 2566 scope.go:117] "RemoveContainer" containerID="3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3" Apr 16 19:17:05.666176 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:17:05.666158 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3\": container with ID starting with 3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3 not found: ID does not exist" 
containerID="3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3" Apr 16 19:17:05.666248 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.666186 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3"} err="failed to get container status \"3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3\": rpc error: code = NotFound desc = could not find container \"3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3\": container with ID starting with 3ecbb6def42fd514f0b5da9b75c4da505be4eb4d2307b1be3a5922404856d4e3 not found: ID does not exist" Apr 16 19:17:05.666248 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.666206 2566 scope.go:117] "RemoveContainer" containerID="6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8" Apr 16 19:17:05.666439 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:17:05.666424 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8\": container with ID starting with 6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8 not found: ID does not exist" containerID="6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8" Apr 16 19:17:05.666478 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.666453 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8"} err="failed to get container status \"6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8\": rpc error: code = NotFound desc = could not find container \"6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8\": container with ID starting with 6cecd9408c83bb637919e314ab0133eaa85102b3d5d14e1eaf2474b868de05a8 not found: ID does not exist" Apr 16 19:17:05.731498 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.731446 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f3cc11-71eb-4027-be32-e08da9781cf1-kserve-provision-location\") pod \"25f3cc11-71eb-4027-be32-e08da9781cf1\" (UID: \"25f3cc11-71eb-4027-be32-e08da9781cf1\") " Apr 16 19:17:05.731764 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.731745 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f3cc11-71eb-4027-be32-e08da9781cf1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "25f3cc11-71eb-4027-be32-e08da9781cf1" (UID: "25f3cc11-71eb-4027-be32-e08da9781cf1"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:17:05.832516 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.832482 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f3cc11-71eb-4027-be32-e08da9781cf1-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:17:05.975022 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.974978 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj"] Apr 16 19:17:05.979080 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:05.979053 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-runtime-predictor-b5d6966c7-f9jcj"] Apr 16 19:17:06.452080 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:06.452046 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" path="/var/lib/kubelet/pods/25f3cc11-71eb-4027-be32-e08da9781cf1/volumes" Apr 16 19:17:14.647714 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:14.647676 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 16 19:17:24.647514 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:24.647475 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 16 19:17:34.647983 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:34.647938 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 16 19:17:44.647946 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:44.647891 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 16 19:17:54.647443 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:17:54.647394 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.63:8080: connect: connection refused" Apr 16 19:18:04.647715 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:04.647682 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:18:08.179300 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.179266 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns"] Apr 16 19:18:08.179749 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.179611 2566 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" containerID="cri-o://ad8a2f48c178872efc97ec022e64d72b4122f48adc1334051ef4a55072a08a8e" gracePeriod=30 Apr 16 19:18:08.240412 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.240382 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm"] Apr 16 19:18:08.240727 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.240715 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="storage-initializer" Apr 16 19:18:08.240777 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.240729 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="storage-initializer" Apr 16 19:18:08.240777 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.240740 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="kserve-container" Apr 16 19:18:08.240777 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.240745 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="kserve-container" Apr 16 19:18:08.240887 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.240801 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="25f3cc11-71eb-4027-be32-e08da9781cf1" containerName="kserve-container" Apr 16 19:18:08.244141 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.244116 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:18:08.246551 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.246529 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:18:08.254637 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.254613 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm"] Apr 16 19:18:08.367108 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.367077 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4b1934-78c1-4cf9-81f2-0d210b4858aa-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-695b7cc5c-lk2gm\" (UID: \"8d4b1934-78c1-4cf9-81f2-0d210b4858aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:18:08.467682 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.467604 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4b1934-78c1-4cf9-81f2-0d210b4858aa-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-695b7cc5c-lk2gm\" (UID: \"8d4b1934-78c1-4cf9-81f2-0d210b4858aa\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:18:08.468041 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.468022 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4b1934-78c1-4cf9-81f2-0d210b4858aa-kserve-provision-location\") pod \"isvc-sklearn-s3-predictor-695b7cc5c-lk2gm\" (UID: \"8d4b1934-78c1-4cf9-81f2-0d210b4858aa\") " 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:18:08.555528 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.555498 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:18:08.676741 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.676714 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm"] Apr 16 19:18:08.680285 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:18:08.680257 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d4b1934_78c1_4cf9_81f2_0d210b4858aa.slice/crio-947b85c84864aac6edae374e0cbe7036569325ab8c3988c6e9123de94cf5e025 WatchSource:0}: Error finding container 947b85c84864aac6edae374e0cbe7036569325ab8c3988c6e9123de94cf5e025: Status 404 returned error can't find the container with id 947b85c84864aac6edae374e0cbe7036569325ab8c3988c6e9123de94cf5e025 Apr 16 19:18:08.840975 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.840935 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" event={"ID":"8d4b1934-78c1-4cf9-81f2-0d210b4858aa","Type":"ContainerStarted","Data":"056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0"} Apr 16 19:18:08.840975 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:08.840979 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" event={"ID":"8d4b1934-78c1-4cf9-81f2-0d210b4858aa","Type":"ContainerStarted","Data":"947b85c84864aac6edae374e0cbe7036569325ab8c3988c6e9123de94cf5e025"} Apr 16 19:18:09.844859 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:09.844821 2566 generic.go:358] "Generic (PLEG): container finished" podID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerID="056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0" exitCode=0 Apr 16 19:18:09.845263 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:09.844913 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" event={"ID":"8d4b1934-78c1-4cf9-81f2-0d210b4858aa","Type":"ContainerDied","Data":"056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0"} Apr 16 19:18:10.850099 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:10.850062 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" event={"ID":"8d4b1934-78c1-4cf9-81f2-0d210b4858aa","Type":"ContainerStarted","Data":"b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d"} Apr 16 19:18:10.850527 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:10.850297 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:18:10.851574 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:10.851545 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 19:18:10.871323 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:10.871272 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podStartSLOduration=2.871257132 podStartE2EDuration="2.871257132s" podCreationTimestamp="2026-04-16 19:18:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:18:10.869815604 +0000 UTC m=+3666.962103592" watchObservedRunningTime="2026-04-16 19:18:10.871257132 +0000 UTC m=+3666.963545098" Apr 16 19:18:11.855460 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:11.855420 2566 generic.go:358] "Generic (PLEG): container finished" podID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerID="ad8a2f48c178872efc97ec022e64d72b4122f48adc1334051ef4a55072a08a8e" exitCode=0 Apr 16 19:18:11.855844 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:11.855492 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" event={"ID":"3ef96a2b-4675-45ef-b733-4052094a41aa","Type":"ContainerDied","Data":"ad8a2f48c178872efc97ec022e64d72b4122f48adc1334051ef4a55072a08a8e"} Apr 16 19:18:11.856192 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:11.856153 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 19:18:11.967585 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:11.967563 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:18:12.000599 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.000570 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ef96a2b-4675-45ef-b733-4052094a41aa-kserve-provision-location\") pod \"3ef96a2b-4675-45ef-b733-4052094a41aa\" (UID: \"3ef96a2b-4675-45ef-b733-4052094a41aa\") " Apr 16 19:18:12.000888 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.000865 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ef96a2b-4675-45ef-b733-4052094a41aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3ef96a2b-4675-45ef-b733-4052094a41aa" (UID: "3ef96a2b-4675-45ef-b733-4052094a41aa"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:18:12.101724 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.101696 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ef96a2b-4675-45ef-b733-4052094a41aa-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:18:12.860202 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.860173 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" event={"ID":"3ef96a2b-4675-45ef-b733-4052094a41aa","Type":"ContainerDied","Data":"7722c483a0de412b50d001f262ffd1a8cf0a1243ab3d68974dc5d31c52f266d2"} Apr 16 19:18:12.860667 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.860221 2566 scope.go:117] "RemoveContainer" containerID="ad8a2f48c178872efc97ec022e64d72b4122f48adc1334051ef4a55072a08a8e" Apr 16 19:18:12.860667 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.860224 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns" Apr 16 19:18:12.867983 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.867961 2566 scope.go:117] "RemoveContainer" containerID="389a51dbebf34e6da514f7eec49997448d90aac84f1f1df3007fa66071f5c3d2" Apr 16 19:18:12.878349 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.878309 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns"] Apr 16 19:18:12.880093 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:12.880075 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-v2-predictor-5db5686f9f-gg5ns"] Apr 16 19:18:14.452285 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:14.452254 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" path="/var/lib/kubelet/pods/3ef96a2b-4675-45ef-b733-4052094a41aa/volumes" Apr 16 19:18:21.856381 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:21.856334 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 19:18:31.856228 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:31.856141 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 19:18:41.856731 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:41.856688 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 19:18:51.856728 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:18:51.856682 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 
19:19:01.856188 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:01.856139 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 19:19:11.856487 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:11.856430 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.64:8080: connect: connection refused" Apr 16 19:19:16.452122 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:16.452095 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:19:18.371632 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.371598 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm"] Apr 16 19:19:18.372030 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.371858 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" containerID="cri-o://b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d" gracePeriod=30 Apr 16 19:19:18.467681 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.467647 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld"] Apr 16 19:19:18.468010 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.467983 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="storage-initializer" Apr 16 19:19:18.468074 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.468011 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="storage-initializer" Apr 16 19:19:18.468074 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.468027 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" Apr 16 19:19:18.468074 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.468033 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" Apr 16 19:19:18.468176 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.468083 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ef96a2b-4675-45ef-b733-4052094a41aa" containerName="kserve-container" Apr 16 19:19:18.472268 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.472250 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.475107 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.475087 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:19:18.482088 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.482067 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld"] Apr 16 19:19:18.561184 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.561140 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2682b351-e998-473e-9295-a03bcc780b9e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.561386 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.561188 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2682b351-e998-473e-9295-a03bcc780b9e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.662273 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.662188 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2682b351-e998-473e-9295-a03bcc780b9e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.662273 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.662227 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2682b351-e998-473e-9295-a03bcc780b9e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.662578 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.662558 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2682b351-e998-473e-9295-a03bcc780b9e-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.662798 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.662782 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2682b351-e998-473e-9295-a03bcc780b9e-cabundle-cert\") pod \"isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.783721 ip-10-0-136-226 kubenswrapper[2566]: 
I0416 19:19:18.783693 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:18.904831 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.904808 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld"] Apr 16 19:19:18.907369 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:19:18.907332 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2682b351_e998_473e_9295_a03bcc780b9e.slice/crio-5e947a48218f0ceee91767f8b7de50ded86a14acc22bce2e3f3940cd8f914bf9 WatchSource:0}: Error finding container 5e947a48218f0ceee91767f8b7de50ded86a14acc22bce2e3f3940cd8f914bf9: Status 404 returned error can't find the container with id 5e947a48218f0ceee91767f8b7de50ded86a14acc22bce2e3f3940cd8f914bf9 Apr 16 19:19:18.909626 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:18.909606 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:19:19.063445 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:19.063409 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" event={"ID":"2682b351-e998-473e-9295-a03bcc780b9e","Type":"ContainerStarted","Data":"89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e"} Apr 16 19:19:19.063445 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:19.063445 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" event={"ID":"2682b351-e998-473e-9295-a03bcc780b9e","Type":"ContainerStarted","Data":"5e947a48218f0ceee91767f8b7de50ded86a14acc22bce2e3f3940cd8f914bf9"} Apr 16 19:19:20.067506 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:20.067467 2566 generic.go:358] "Generic (PLEG): container finished" podID="2682b351-e998-473e-9295-a03bcc780b9e" containerID="89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e" exitCode=0 Apr 16 19:19:20.067959 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:20.067556 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" event={"ID":"2682b351-e998-473e-9295-a03bcc780b9e","Type":"ContainerDied","Data":"89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e"} Apr 16 19:19:21.072076 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:21.072039 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" event={"ID":"2682b351-e998-473e-9295-a03bcc780b9e","Type":"ContainerStarted","Data":"068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844"} Apr 16 19:19:21.072496 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:21.072239 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:19:21.073662 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:21.073632 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 
19:19:21.097792 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:21.097752 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podStartSLOduration=3.097740022 podStartE2EDuration="3.097740022s" podCreationTimestamp="2026-04-16 19:19:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:19:21.095203279 +0000 UTC m=+3737.187491244" watchObservedRunningTime="2026-04-16 19:19:21.097740022 +0000 UTC m=+3737.190027986" Apr 16 19:19:22.075897 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:22.075862 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:19:22.515081 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:22.515056 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:19:22.597739 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:22.597708 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4b1934-78c1-4cf9-81f2-0d210b4858aa-kserve-provision-location\") pod \"8d4b1934-78c1-4cf9-81f2-0d210b4858aa\" (UID: \"8d4b1934-78c1-4cf9-81f2-0d210b4858aa\") " Apr 16 19:19:22.598027 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:22.597984 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d4b1934-78c1-4cf9-81f2-0d210b4858aa-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8d4b1934-78c1-4cf9-81f2-0d210b4858aa" (UID: "8d4b1934-78c1-4cf9-81f2-0d210b4858aa"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:19:22.699217 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:22.699147 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8d4b1934-78c1-4cf9-81f2-0d210b4858aa-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:19:23.084721 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.084633 2566 generic.go:358] "Generic (PLEG): container finished" podID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerID="b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d" exitCode=0 Apr 16 19:19:23.085110 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.084723 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" Apr 16 19:19:23.085110 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.084715 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" event={"ID":"8d4b1934-78c1-4cf9-81f2-0d210b4858aa","Type":"ContainerDied","Data":"b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d"} Apr 16 19:19:23.085110 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.084834 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm" event={"ID":"8d4b1934-78c1-4cf9-81f2-0d210b4858aa","Type":"ContainerDied","Data":"947b85c84864aac6edae374e0cbe7036569325ab8c3988c6e9123de94cf5e025"} Apr 16 19:19:23.085110 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.084854 2566 scope.go:117] "RemoveContainer" containerID="b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d" Apr 16 19:19:23.092921 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.092899 2566 scope.go:117] "RemoveContainer" containerID="056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0" Apr 16 19:19:23.099589 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.099575 2566 scope.go:117] "RemoveContainer" containerID="b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d" Apr 16 19:19:23.099804 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:19:23.099788 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d\": container with ID starting with b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d not found: ID does not exist" containerID="b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d" Apr 16 19:19:23.099843 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.099813 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d"} err="failed to get container status \"b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d\": rpc error: code = NotFound desc = could not find container \"b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d\": container with ID starting with b33ad30055016b8933f1fd7ee35b5cc857fb302a4c91e2a26f7e332302ea7e6d not found: ID does not exist" Apr 16 19:19:23.099843 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.099828 2566 scope.go:117] "RemoveContainer" containerID="056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0" Apr 16 19:19:23.100090 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:19:23.100071 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0\": container with ID starting with 056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0 not found: ID does not exist" containerID="056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0" Apr 16 19:19:23.100137 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.100096 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0"} err="failed to get container status \"056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0\": rpc error: code = NotFound 
desc = could not find container \"056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0\": container with ID starting with 056780160de547480b234d2dc4133f27edf04437a1546fa6be970c64ccbc7fa0 not found: ID does not exist" Apr 16 19:19:23.112052 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.112028 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm"] Apr 16 19:19:23.119966 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:23.119947 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-predictor-695b7cc5c-lk2gm"] Apr 16 19:19:24.452132 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:24.452100 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" path="/var/lib/kubelet/pods/8d4b1934-78c1-4cf9-81f2-0d210b4858aa/volumes" Apr 16 19:19:32.076778 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:32.076734 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:19:42.076062 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:42.076018 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:19:52.076129 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:19:52.076080 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:20:02.076674 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:02.076583 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:20:12.076633 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:12.076590 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:20:22.075853 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:22.075810 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:20:23.448095 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:23.448045 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" 
probeResult="failure" output="dial tcp 10.132.0.65:8080: connect: connection refused" Apr 16 19:20:33.449226 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:33.449190 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:20:38.538019 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:38.537965 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld"] Apr 16 19:20:38.538486 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:38.538317 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" containerID="cri-o://068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844" gracePeriod=30 Apr 16 19:20:39.611533 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.611503 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc"] Apr 16 19:20:39.611904 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.611823 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" Apr 16 19:20:39.611904 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.611834 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" Apr 16 19:20:39.611904 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.611845 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="storage-initializer" Apr 16 19:20:39.611904 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.611850 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="storage-initializer" Apr 16 19:20:39.612072 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.611914 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d4b1934-78c1-4cf9-81f2-0d210b4858aa" containerName="kserve-container" Apr 16 19:20:39.614731 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.614710 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" Apr 16 19:20:39.624520 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.624497 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc"] Apr 16 19:20:39.745440 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.745409 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfefdebd-253d-4787-a22d-898c29bc379d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc\" (UID: \"cfefdebd-253d-4787-a22d-898c29bc379d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" Apr 16 19:20:39.846820 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.846783 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfefdebd-253d-4787-a22d-898c29bc379d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc\" (UID: \"cfefdebd-253d-4787-a22d-898c29bc379d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" Apr 16 19:20:39.847176 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.847154 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfefdebd-253d-4787-a22d-898c29bc379d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc\" (UID: \"cfefdebd-253d-4787-a22d-898c29bc379d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" Apr 16 19:20:39.925953 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:39.925863 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" Apr 16 19:20:40.045155 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:40.045131 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc"] Apr 16 19:20:40.047338 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:20:40.047303 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfefdebd_253d_4787_a22d_898c29bc379d.slice/crio-f8d666187217def53e4e5baf1db03d9d68e10ff0f06288a3a2efbf321aa8b8cc WatchSource:0}: Error finding container f8d666187217def53e4e5baf1db03d9d68e10ff0f06288a3a2efbf321aa8b8cc: Status 404 returned error can't find the container with id f8d666187217def53e4e5baf1db03d9d68e10ff0f06288a3a2efbf321aa8b8cc Apr 16 19:20:40.322925 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:40.322840 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" event={"ID":"cfefdebd-253d-4787-a22d-898c29bc379d","Type":"ContainerStarted","Data":"c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41"} Apr 16 19:20:40.322925 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:40.322886 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" event={"ID":"cfefdebd-253d-4787-a22d-898c29bc379d","Type":"ContainerStarted","Data":"f8d666187217def53e4e5baf1db03d9d68e10ff0f06288a3a2efbf321aa8b8cc"} Apr 16 19:20:42.330572 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.330492 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_cfefdebd-253d-4787-a22d-898c29bc379d/storage-initializer/0.log" Apr 16 19:20:42.330572 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.330531 2566 generic.go:358] "Generic (PLEG): container finished" podID="cfefdebd-253d-4787-a22d-898c29bc379d" containerID="c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41" exitCode=1 Apr 16 19:20:42.330980 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.330606 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" event={"ID":"cfefdebd-253d-4787-a22d-898c29bc379d","Type":"ContainerDied","Data":"c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41"} Apr 16 19:20:42.786256 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.786228 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:20:42.873859 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.873824 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2682b351-e998-473e-9295-a03bcc780b9e-kserve-provision-location\") pod \"2682b351-e998-473e-9295-a03bcc780b9e\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " Apr 16 19:20:42.874060 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.873926 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2682b351-e998-473e-9295-a03bcc780b9e-cabundle-cert\") pod \"2682b351-e998-473e-9295-a03bcc780b9e\" (UID: \"2682b351-e998-473e-9295-a03bcc780b9e\") " Apr 16 19:20:42.874265 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.874169 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2682b351-e998-473e-9295-a03bcc780b9e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2682b351-e998-473e-9295-a03bcc780b9e" (UID: "2682b351-e998-473e-9295-a03bcc780b9e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:20:42.874334 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.874269 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2682b351-e998-473e-9295-a03bcc780b9e-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "2682b351-e998-473e-9295-a03bcc780b9e" (UID: "2682b351-e998-473e-9295-a03bcc780b9e"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:20:42.975352 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.975264 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2682b351-e998-473e-9295-a03bcc780b9e-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:20:42.975352 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:42.975298 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/2682b351-e998-473e-9295-a03bcc780b9e-cabundle-cert\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:20:43.336094 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.336024 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_cfefdebd-253d-4787-a22d-898c29bc379d/storage-initializer/0.log" Apr 16 19:20:43.336507 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.336099 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" event={"ID":"cfefdebd-253d-4787-a22d-898c29bc379d","Type":"ContainerStarted","Data":"7dc184cd29f1013d31f86070d6e26f267c48f436f95c492548016f2e4a376d9d"} Apr 16 19:20:43.337563 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.337535 2566 generic.go:358] "Generic (PLEG): container finished" podID="2682b351-e998-473e-9295-a03bcc780b9e" containerID="068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844" exitCode=0 Apr 16 19:20:43.337673 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.337592 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" Apr 16 19:20:43.337673 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.337604 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" event={"ID":"2682b351-e998-473e-9295-a03bcc780b9e","Type":"ContainerDied","Data":"068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844"} Apr 16 19:20:43.337673 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.337627 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld" event={"ID":"2682b351-e998-473e-9295-a03bcc780b9e","Type":"ContainerDied","Data":"5e947a48218f0ceee91767f8b7de50ded86a14acc22bce2e3f3940cd8f914bf9"} Apr 16 19:20:43.337673 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.337642 2566 scope.go:117] "RemoveContainer" containerID="068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844" Apr 16 19:20:43.345872 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.345858 2566 scope.go:117] "RemoveContainer" containerID="89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e" Apr 16 19:20:43.352670 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.352655 2566 scope.go:117] "RemoveContainer" containerID="068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844" Apr 16 19:20:43.352878 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:20:43.352862 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844\": container with ID starting with 068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844 not found: ID does not exist" containerID="068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844" Apr 16 19:20:43.352918 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.352886 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844"} err="failed to get container status \"068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844\": rpc error: code = NotFound desc = could not find container \"068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844\": container with ID starting with 068c4f7967b0f18bebb806b8bef4b7c4df1cb7b31843ab10033e06deb36fb844 not found: ID does not exist" Apr 16 19:20:43.352918 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.352907 2566 scope.go:117] "RemoveContainer" containerID="89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e" Apr 16 19:20:43.353128 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:20:43.353109 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e\": container with ID starting with 89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e not found: ID does not exist" containerID="89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e" Apr 16 19:20:43.353174 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.353133 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e"} err="failed to get container status 
\"89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e\": rpc error: code = NotFound desc = could not find container \"89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e\": container with ID starting with 89c26d9542a752679df00ae7191ee34567dd0fcf30f39b5e446a7975651cb59e not found: ID does not exist" Apr 16 19:20:43.371651 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.371623 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld"] Apr 16 19:20:43.376274 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:43.376254 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-pass-predictor-5484987b89-h4jld"] Apr 16 19:20:44.452663 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:44.452632 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2682b351-e998-473e-9295-a03bcc780b9e" path="/var/lib/kubelet/pods/2682b351-e998-473e-9295-a03bcc780b9e/volumes" Apr 16 19:20:45.346401 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:45.346375 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_cfefdebd-253d-4787-a22d-898c29bc379d/storage-initializer/1.log" Apr 16 19:20:45.346730 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:45.346715 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_cfefdebd-253d-4787-a22d-898c29bc379d/storage-initializer/0.log" Apr 16 19:20:45.346783 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:45.346746 2566 generic.go:358] "Generic (PLEG): container finished" podID="cfefdebd-253d-4787-a22d-898c29bc379d" containerID="7dc184cd29f1013d31f86070d6e26f267c48f436f95c492548016f2e4a376d9d" exitCode=1 Apr 16 19:20:45.346846 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:45.346825 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" event={"ID":"cfefdebd-253d-4787-a22d-898c29bc379d","Type":"ContainerDied","Data":"7dc184cd29f1013d31f86070d6e26f267c48f436f95c492548016f2e4a376d9d"} Apr 16 19:20:45.346884 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:45.346872 2566 scope.go:117] "RemoveContainer" containerID="c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41" Apr 16 19:20:45.347258 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:45.347243 2566 scope.go:117] "RemoveContainer" containerID="c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41" Apr 16 19:20:45.357650 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:20:45.357622 2566 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_kserve-ci-e2e-test_cfefdebd-253d-4787-a22d-898c29bc379d_0 in pod sandbox f8d666187217def53e4e5baf1db03d9d68e10ff0f06288a3a2efbf321aa8b8cc from index: no such id: 'c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41'" containerID="c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41" Apr 16 19:20:45.357717 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:45.357660 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41"} err="rpc error: code = Unknown desc = failed to 
delete container k8s_storage-initializer_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_kserve-ci-e2e-test_cfefdebd-253d-4787-a22d-898c29bc379d_0 in pod sandbox f8d666187217def53e4e5baf1db03d9d68e10ff0f06288a3a2efbf321aa8b8cc from index: no such id: 'c0a41e358a533b45c242484d4d6d2595e02235d2a218707d687666f55871db41'" Apr 16 19:20:45.357854 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:20:45.357836 2566 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_kserve-ci-e2e-test(cfefdebd-253d-4787-a22d-898c29bc379d)\"" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" Apr 16 19:20:46.351917 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:46.351891 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_cfefdebd-253d-4787-a22d-898c29bc379d/storage-initializer/1.log" Apr 16 19:20:49.626052 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:49.626022 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc"] Apr 16 19:20:49.753093 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:49.753066 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_cfefdebd-253d-4787-a22d-898c29bc379d/storage-initializer/1.log" Apr 16 19:20:49.753213 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:49.753133 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" Apr 16 19:20:49.836736 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:49.836701 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfefdebd-253d-4787-a22d-898c29bc379d-kserve-provision-location\") pod \"cfefdebd-253d-4787-a22d-898c29bc379d\" (UID: \"cfefdebd-253d-4787-a22d-898c29bc379d\") " Apr 16 19:20:49.837008 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:49.836966 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfefdebd-253d-4787-a22d-898c29bc379d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cfefdebd-253d-4787-a22d-898c29bc379d" (UID: "cfefdebd-253d-4787-a22d-898c29bc379d"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:20:49.938085 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:49.937967 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cfefdebd-253d-4787-a22d-898c29bc379d-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:20:50.372157 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.372131 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc_cfefdebd-253d-4787-a22d-898c29bc379d/storage-initializer/1.log" Apr 16 19:20:50.372328 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.372204 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" event={"ID":"cfefdebd-253d-4787-a22d-898c29bc379d","Type":"ContainerDied","Data":"f8d666187217def53e4e5baf1db03d9d68e10ff0f06288a3a2efbf321aa8b8cc"} Apr 16 19:20:50.372328 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.372233 2566 scope.go:117] "RemoveContainer" containerID="7dc184cd29f1013d31f86070d6e26f267c48f436f95c492548016f2e4a376d9d" Apr 16 19:20:50.372328 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.372239 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc" Apr 16 19:20:50.411861 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.411836 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc"] Apr 16 19:20:50.416279 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.416257 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-global-fail-predictor-754959c6d7-sb2dc"] Apr 16 19:20:50.452922 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.452895 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" path="/var/lib/kubelet/pods/cfefdebd-253d-4787-a22d-898c29bc379d/volumes" Apr 16 19:20:50.683551 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.683444 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x"] Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.683908 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.683928 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.683955 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="storage-initializer" Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.683963 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="storage-initializer" Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.683971 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" 
containerName="storage-initializer" Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.683979 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" containerName="storage-initializer" Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.684012 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" containerName="storage-initializer" Apr 16 19:20:50.684026 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.684021 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" containerName="storage-initializer" Apr 16 19:20:50.684451 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.684112 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="2682b351-e998-473e-9295-a03bcc780b9e" containerName="kserve-container" Apr 16 19:20:50.684451 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.684125 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" containerName="storage-initializer" Apr 16 19:20:50.684451 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.684145 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="cfefdebd-253d-4787-a22d-898c29bc379d" containerName="storage-initializer" Apr 16 19:20:50.688762 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.688741 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:50.691165 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.691146 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"storage-config\"" Apr 16 19:20:50.691165 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.691156 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-tdsxq\"" Apr 16 19:20:50.691349 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.691203 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:20:50.694957 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.694937 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x"] Apr 16 19:20:50.845597 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.845558 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:50.845791 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.845680 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:50.946943 ip-10-0-136-226 kubenswrapper[2566]: 
I0416 19:20:50.946838 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:50.946943 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.946901 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:50.947273 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.947256 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:50.947555 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.947533 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-cabundle-cert\") pod \"isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:50.999679 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:50.999650 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:51.122544 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:51.122508 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x"] Apr 16 19:20:51.125648 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:20:51.125615 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca8a42b9_1e14_406d_bc0c_d0a7b8360f9d.slice/crio-2b8b74e9b54cdb3c9aec0cc053fdcd37f44b1b60c6a497b0a600822442e38f71 WatchSource:0}: Error finding container 2b8b74e9b54cdb3c9aec0cc053fdcd37f44b1b60c6a497b0a600822442e38f71: Status 404 returned error can't find the container with id 2b8b74e9b54cdb3c9aec0cc053fdcd37f44b1b60c6a497b0a600822442e38f71 Apr 16 19:20:51.377916 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:51.377883 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" event={"ID":"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d","Type":"ContainerStarted","Data":"0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883"} Apr 16 19:20:51.377916 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:51.377927 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" event={"ID":"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d","Type":"ContainerStarted","Data":"2b8b74e9b54cdb3c9aec0cc053fdcd37f44b1b60c6a497b0a600822442e38f71"} Apr 16 19:20:52.383672 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:52.383639 2566 generic.go:358] "Generic (PLEG): container finished" podID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerID="0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883" exitCode=0 Apr 16 19:20:52.384064 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:52.383707 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" event={"ID":"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d","Type":"ContainerDied","Data":"0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883"} Apr 16 19:20:53.389062 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:53.389022 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" event={"ID":"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d","Type":"ContainerStarted","Data":"a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233"} Apr 16 19:20:53.389528 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:53.389242 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:20:53.390512 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:53.390488 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:20:53.409926 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:53.409880 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" 
podStartSLOduration=3.4098694959999998 podStartE2EDuration="3.409869496s" podCreationTimestamp="2026-04-16 19:20:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:20:53.408062129 +0000 UTC m=+3829.500350094" watchObservedRunningTime="2026-04-16 19:20:53.409869496 +0000 UTC m=+3829.502157461" Apr 16 19:20:54.397055 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:20:54.397016 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:21:04.393254 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:21:04.393208 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:21:14.394064 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:21:14.394019 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:21:24.393334 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:21:24.393249 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:21:34.393863 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:21:34.393816 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:21:44.393235 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:21:44.393191 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:21:54.394236 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:21:54.394194 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:22:04.394682 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:04.394649 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:22:10.753363 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:10.753327 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x"] Apr 16 19:22:10.753822 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:10.753574 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" containerID="cri-o://a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233" gracePeriod=30 Apr 16 19:22:11.830484 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:11.830443 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh"] Apr 16 19:22:11.834105 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:11.834081 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" Apr 16 19:22:11.843761 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:11.843737 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh"] Apr 16 19:22:11.905763 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:11.905729 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebaa427-79c2-4d46-a113-2d1ff8907add-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh\" (UID: \"bebaa427-79c2-4d46-a113-2d1ff8907add\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" Apr 16 19:22:12.006343 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:12.006302 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebaa427-79c2-4d46-a113-2d1ff8907add-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh\" (UID: \"bebaa427-79c2-4d46-a113-2d1ff8907add\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" Apr 16 19:22:12.006694 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:12.006670 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebaa427-79c2-4d46-a113-2d1ff8907add-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh\" (UID: \"bebaa427-79c2-4d46-a113-2d1ff8907add\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" Apr 16 19:22:12.144989 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:12.144955 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" Apr 16 19:22:12.271867 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:12.271842 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh"] Apr 16 19:22:12.274548 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:22:12.274510 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbebaa427_79c2_4d46_a113_2d1ff8907add.slice/crio-69b2694381a22331b71748b41d2545a3fd33a25ca6affb75d20548a9486284d4 WatchSource:0}: Error finding container 69b2694381a22331b71748b41d2545a3fd33a25ca6affb75d20548a9486284d4: Status 404 returned error can't find the container with id 69b2694381a22331b71748b41d2545a3fd33a25ca6affb75d20548a9486284d4 Apr 16 19:22:12.629302 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:12.629270 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" event={"ID":"bebaa427-79c2-4d46-a113-2d1ff8907add","Type":"ContainerStarted","Data":"0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3"} Apr 16 19:22:12.629302 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:12.629306 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" event={"ID":"bebaa427-79c2-4d46-a113-2d1ff8907add","Type":"ContainerStarted","Data":"69b2694381a22331b71748b41d2545a3fd33a25ca6affb75d20548a9486284d4"} Apr 16 19:22:14.394073 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:14.394031 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.67:8080: connect: connection refused" Apr 16 19:22:15.000944 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.000920 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:22:15.029045 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.028956 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-cabundle-cert\") pod \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " Apr 16 19:22:15.029195 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.029086 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-kserve-provision-location\") pod \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\" (UID: \"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d\") " Apr 16 19:22:15.029377 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.029351 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" (UID: "ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:22:15.029452 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.029352 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" (UID: "ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:22:15.130460 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.130428 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:22:15.130460 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.130454 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d-cabundle-cert\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:22:15.640545 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.640512 2566 generic.go:358] "Generic (PLEG): container finished" podID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerID="a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233" exitCode=0 Apr 16 19:22:15.640971 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.640567 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" event={"ID":"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d","Type":"ContainerDied","Data":"a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233"} Apr 16 19:22:15.640971 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.640577 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" Apr 16 19:22:15.640971 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.640593 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x" event={"ID":"ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d","Type":"ContainerDied","Data":"2b8b74e9b54cdb3c9aec0cc053fdcd37f44b1b60c6a497b0a600822442e38f71"} Apr 16 19:22:15.640971 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.640610 2566 scope.go:117] "RemoveContainer" containerID="a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233" Apr 16 19:22:15.648754 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.648728 2566 scope.go:117] "RemoveContainer" containerID="0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883" Apr 16 19:22:15.655794 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.655775 2566 scope.go:117] "RemoveContainer" containerID="a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233" Apr 16 19:22:15.656030 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:22:15.656007 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233\": container with ID starting with a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233 not found: ID does not exist" containerID="a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233" Apr 16 19:22:15.656085 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.656038 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233"} err="failed to get container status \"a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233\": rpc error: code = NotFound desc = could not find container \"a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233\": container with ID starting with a48d8a36b20075ba898c824e2747946cd686a6ad95540a4157b05274dd28c233 not found: ID does not exist" Apr 16 19:22:15.656085 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.656054 2566 scope.go:117] "RemoveContainer" containerID="0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883" Apr 16 19:22:15.656306 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:22:15.656288 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883\": container with ID starting with 0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883 not found: ID does not exist" containerID="0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883" Apr 16 19:22:15.656352 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.656313 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883"} err="failed to get container status \"0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883\": rpc error: code = NotFound desc = could not find container \"0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883\": container with ID starting with 0d3d22866b430f1720fa002ff61d55792b7f056047618489393d1c07513dd883 not found: ID does not exist" Apr 16 19:22:15.664471 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.664449 2566 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x"] Apr 16 19:22:15.669056 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:15.669036 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-pass-predictor-566f78cd49-bjs8x"] Apr 16 19:22:16.452260 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:16.452231 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" path="/var/lib/kubelet/pods/ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d/volumes" Apr 16 19:22:16.644821 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:16.644798 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh_bebaa427-79c2-4d46-a113-2d1ff8907add/storage-initializer/0.log" Apr 16 19:22:16.645235 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:16.644832 2566 generic.go:358] "Generic (PLEG): container finished" podID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerID="0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3" exitCode=1 Apr 16 19:22:16.645235 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:16.644910 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" event={"ID":"bebaa427-79c2-4d46-a113-2d1ff8907add","Type":"ContainerDied","Data":"0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3"} Apr 16 19:22:17.650340 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:17.650312 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh_bebaa427-79c2-4d46-a113-2d1ff8907add/storage-initializer/0.log" Apr 16 19:22:17.650702 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:17.650416 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" event={"ID":"bebaa427-79c2-4d46-a113-2d1ff8907add","Type":"ContainerStarted","Data":"e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b"} Apr 16 19:22:21.897945 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:21.897911 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh"] Apr 16 19:22:21.898484 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:21.898248 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerName="storage-initializer" containerID="cri-o://e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b" gracePeriod=30 Apr 16 19:22:22.910233 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.910200 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n"] Apr 16 19:22:22.910585 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.910523 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" Apr 16 19:22:22.910585 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.910533 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" Apr 16 
19:22:22.910585 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.910549 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="storage-initializer" Apr 16 19:22:22.910585 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.910555 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="storage-initializer" Apr 16 19:22:22.910716 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.910607 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="ca8a42b9-1e14-406d-bc0c-d0a7b8360f9d" containerName="kserve-container" Apr 16 19:22:22.913651 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.913633 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:22.916197 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.916178 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 16 19:22:22.921944 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.921922 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n"] Apr 16 19:22:22.991903 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.991871 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:22.992067 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:22.991910 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:23.093191 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.093146 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:23.093191 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.093194 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:23.093557 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.093541 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:23.093736 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.093719 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-cabundle-cert\") pod \"isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:23.225039 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.224935 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:23.357232 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.357097 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n"] Apr 16 19:22:23.359393 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:22:23.359366 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5e50a35_0d90_4197_9ca0_69e34d6fb4f2.slice/crio-41d1751ba7cea21b9a03b5c957f7a974c4621c6550c648de9fd26a2908f38513 WatchSource:0}: Error finding container 41d1751ba7cea21b9a03b5c957f7a974c4621c6550c648de9fd26a2908f38513: Status 404 returned error can't find the container with id 41d1751ba7cea21b9a03b5c957f7a974c4621c6550c648de9fd26a2908f38513 Apr 16 19:22:23.531771 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.531752 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh_bebaa427-79c2-4d46-a113-2d1ff8907add/storage-initializer/1.log" Apr 16 19:22:23.532107 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.532087 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh_bebaa427-79c2-4d46-a113-2d1ff8907add/storage-initializer/0.log" Apr 16 19:22:23.532211 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.532147 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" Apr 16 19:22:23.597600 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.597575 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebaa427-79c2-4d46-a113-2d1ff8907add-kserve-provision-location\") pod \"bebaa427-79c2-4d46-a113-2d1ff8907add\" (UID: \"bebaa427-79c2-4d46-a113-2d1ff8907add\") " Apr 16 19:22:23.597839 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.597818 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bebaa427-79c2-4d46-a113-2d1ff8907add-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "bebaa427-79c2-4d46-a113-2d1ff8907add" (UID: "bebaa427-79c2-4d46-a113-2d1ff8907add"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:22:23.674576 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.674547 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh_bebaa427-79c2-4d46-a113-2d1ff8907add/storage-initializer/1.log" Apr 16 19:22:23.674927 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.674913 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh_bebaa427-79c2-4d46-a113-2d1ff8907add/storage-initializer/0.log" Apr 16 19:22:23.674987 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.674946 2566 generic.go:358] "Generic (PLEG): container finished" podID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerID="e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b" exitCode=1 Apr 16 19:22:23.675058 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.675043 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" Apr 16 19:22:23.675141 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.675035 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" event={"ID":"bebaa427-79c2-4d46-a113-2d1ff8907add","Type":"ContainerDied","Data":"e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b"} Apr 16 19:22:23.675198 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.675162 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh" event={"ID":"bebaa427-79c2-4d46-a113-2d1ff8907add","Type":"ContainerDied","Data":"69b2694381a22331b71748b41d2545a3fd33a25ca6affb75d20548a9486284d4"} Apr 16 19:22:23.675198 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.675184 2566 scope.go:117] "RemoveContainer" containerID="e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b" Apr 16 19:22:23.676543 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.676521 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" event={"ID":"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2","Type":"ContainerStarted","Data":"1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc"} Apr 16 19:22:23.676658 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.676553 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" event={"ID":"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2","Type":"ContainerStarted","Data":"41d1751ba7cea21b9a03b5c957f7a974c4621c6550c648de9fd26a2908f38513"} Apr 16 19:22:23.684825 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.684810 2566 scope.go:117] "RemoveContainer" containerID="0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3" Apr 16 19:22:23.692132 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.692115 2566 scope.go:117] "RemoveContainer" containerID="e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b" Apr 16 19:22:23.692381 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:22:23.692356 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b\": container with ID starting with 
e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b not found: ID does not exist" containerID="e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b" Apr 16 19:22:23.692437 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.692383 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b"} err="failed to get container status \"e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b\": rpc error: code = NotFound desc = could not find container \"e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b\": container with ID starting with e244de7513db255e3eb49817f90da57c9bbaef14ce66eb0233f6a19aeb7ba81b not found: ID does not exist" Apr 16 19:22:23.692437 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.692402 2566 scope.go:117] "RemoveContainer" containerID="0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3" Apr 16 19:22:23.692663 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:22:23.692647 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3\": container with ID starting with 0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3 not found: ID does not exist" containerID="0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3" Apr 16 19:22:23.692717 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.692673 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3"} err="failed to get container status \"0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3\": rpc error: code = NotFound desc = could not find container \"0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3\": container with ID starting with 0a0677ba0271cb4c43ca0bd0f161389a50b10c5050c3e9643e58e35f9f240db3 not found: ID does not exist" Apr 16 19:22:23.698247 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.698227 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/bebaa427-79c2-4d46-a113-2d1ff8907add-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:22:23.724586 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.724547 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh"] Apr 16 19:22:23.726351 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:23.726331 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-custom-fail-predictor-665c965d55-jhdzh"] Apr 16 19:22:24.452084 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:24.452056 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" path="/var/lib/kubelet/pods/bebaa427-79c2-4d46-a113-2d1ff8907add/volumes" Apr 16 19:22:24.682787 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:24.682760 2566 generic.go:358] "Generic (PLEG): container finished" podID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerID="1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc" exitCode=0 Apr 16 19:22:24.682937 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:24.682842 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" event={"ID":"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2","Type":"ContainerDied","Data":"1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc"} Apr 16 19:22:25.687685 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:25.687649 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" event={"ID":"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2","Type":"ContainerStarted","Data":"da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5"} Apr 16 19:22:25.688130 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:25.687826 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:22:25.689191 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:25.689166 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:22:25.706358 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:25.706301 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podStartSLOduration=3.706288792 podStartE2EDuration="3.706288792s" podCreationTimestamp="2026-04-16 19:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:22:25.704503359 +0000 UTC m=+3921.796791345" watchObservedRunningTime="2026-04-16 19:22:25.706288792 +0000 UTC m=+3921.798576756" Apr 16 19:22:26.690874 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:26.690836 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:22:36.691056 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:36.691014 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:22:46.691478 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:46.691430 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:22:56.691795 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:22:56.691709 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:23:06.691440 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:06.691399 2566 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:23:16.691553 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:16.691507 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:23:26.691342 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:26.691294 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:23:33.449839 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:33.449812 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:23:42.984522 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:42.984487 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n"] Apr 16 19:23:42.985025 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:42.984851 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" containerID="cri-o://da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5" gracePeriod=30 Apr 16 19:23:43.449560 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:43.449523 2566 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" probeResult="failure" output="dial tcp 10.132.0.69:8080: connect: connection refused" Apr 16 19:23:44.046321 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.046286 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85"] Apr 16 19:23:44.046678 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.046657 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerName="storage-initializer" Apr 16 19:23:44.046678 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.046671 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerName="storage-initializer" Apr 16 19:23:44.046783 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.046732 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerName="storage-initializer" Apr 16 19:23:44.046826 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.046816 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerName="storage-initializer" Apr 16 19:23:44.046862 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.046828 
2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerName="storage-initializer" Apr 16 19:23:44.046918 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.046899 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="bebaa427-79c2-4d46-a113-2d1ff8907add" containerName="storage-initializer" Apr 16 19:23:44.049769 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.049748 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" Apr 16 19:23:44.059977 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.059948 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85"] Apr 16 19:23:44.151152 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.151124 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d784b08-2f86-4c75-bf0d-64efcfbac669-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85\" (UID: \"1d784b08-2f86-4c75-bf0d-64efcfbac669\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" Apr 16 19:23:44.251607 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.251578 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d784b08-2f86-4c75-bf0d-64efcfbac669-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85\" (UID: \"1d784b08-2f86-4c75-bf0d-64efcfbac669\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" Apr 16 19:23:44.251929 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.251911 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d784b08-2f86-4c75-bf0d-64efcfbac669-kserve-provision-location\") pod \"isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85\" (UID: \"1d784b08-2f86-4c75-bf0d-64efcfbac669\") " pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" Apr 16 19:23:44.361943 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.361907 2566 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" Apr 16 19:23:44.481785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.481746 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85"] Apr 16 19:23:44.485037 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:23:44.484982 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d784b08_2f86_4c75_bf0d_64efcfbac669.slice/crio-82d191c9af46451b2a1d6944399c944bc308a4a1fd851c3b1a5114eba3ba00cf WatchSource:0}: Error finding container 82d191c9af46451b2a1d6944399c944bc308a4a1fd851c3b1a5114eba3ba00cf: Status 404 returned error can't find the container with id 82d191c9af46451b2a1d6944399c944bc308a4a1fd851c3b1a5114eba3ba00cf Apr 16 19:23:44.939378 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.939341 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" event={"ID":"1d784b08-2f86-4c75-bf0d-64efcfbac669","Type":"ContainerStarted","Data":"2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258"} Apr 16 19:23:44.939378 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:44.939377 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" event={"ID":"1d784b08-2f86-4c75-bf0d-64efcfbac669","Type":"ContainerStarted","Data":"82d191c9af46451b2a1d6944399c944bc308a4a1fd851c3b1a5114eba3ba00cf"} Apr 16 19:23:47.426519 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.426494 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:23:47.478015 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.477965 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-kserve-provision-location\") pod \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " Apr 16 19:23:47.478196 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.478115 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-cabundle-cert\") pod \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\" (UID: \"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2\") " Apr 16 19:23:47.478270 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.478216 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" (UID: "a5e50a35-0d90-4197-9ca0-69e34d6fb4f2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:23:47.478364 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.478348 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:23:47.478440 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.478421 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" (UID: "a5e50a35-0d90-4197-9ca0-69e34d6fb4f2"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:23:47.578862 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.578764 2566 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2-cabundle-cert\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:23:47.950534 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.950500 2566 generic.go:358] "Generic (PLEG): container finished" podID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerID="da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5" exitCode=0 Apr 16 19:23:47.950732 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.950550 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" event={"ID":"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2","Type":"ContainerDied","Data":"da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5"} Apr 16 19:23:47.950732 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.950564 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" Apr 16 19:23:47.950732 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.950576 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n" event={"ID":"a5e50a35-0d90-4197-9ca0-69e34d6fb4f2","Type":"ContainerDied","Data":"41d1751ba7cea21b9a03b5c957f7a974c4621c6550c648de9fd26a2908f38513"} Apr 16 19:23:47.950732 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.950595 2566 scope.go:117] "RemoveContainer" containerID="da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5" Apr 16 19:23:47.959323 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.959304 2566 scope.go:117] "RemoveContainer" containerID="1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc" Apr 16 19:23:47.968031 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.968011 2566 scope.go:117] "RemoveContainer" containerID="da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5" Apr 16 19:23:47.968411 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:23:47.968381 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5\": container with ID starting with da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5 not found: ID does not exist" containerID="da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5" Apr 16 19:23:47.968493 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.968421 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5"} err="failed to get container status \"da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5\": rpc error: code = NotFound desc = could not find container \"da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5\": container with ID starting with da1f767604089c800da9c45c4634f20fcf13e61df88cafdd8a3926771592c0f5 not found: ID does not exist" Apr 16 19:23:47.968493 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.968457 2566 scope.go:117] "RemoveContainer" containerID="1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc" Apr 16 19:23:47.968727 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:23:47.968703 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc\": container with ID starting with 1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc not found: ID does not exist" containerID="1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc" Apr 16 19:23:47.968773 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.968735 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc"} err="failed to get container status \"1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc\": rpc error: code = NotFound desc = could not find container \"1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc\": container with ID starting with 1f44ab764e7f370c5126d8828da7f133d89bd47a6a111cbf24d84015aa03dddc not found: ID does not exist" Apr 16 19:23:47.974701 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.974679 2566 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n"] Apr 16 19:23:47.979797 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:47.979766 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-pass-predictor-5f684c44cb-q4p8n"] Apr 16 19:23:48.452017 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:48.451953 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" path="/var/lib/kubelet/pods/a5e50a35-0d90-4197-9ca0-69e34d6fb4f2/volumes" Apr 16 19:23:51.966064 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:51.966040 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85_1d784b08-2f86-4c75-bf0d-64efcfbac669/storage-initializer/0.log" Apr 16 19:23:51.966467 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:51.966075 2566 generic.go:358] "Generic (PLEG): container finished" podID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerID="2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258" exitCode=1 Apr 16 19:23:51.966467 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:51.966115 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" event={"ID":"1d784b08-2f86-4c75-bf0d-64efcfbac669","Type":"ContainerDied","Data":"2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258"} Apr 16 19:23:52.970362 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:52.970335 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85_1d784b08-2f86-4c75-bf0d-64efcfbac669/storage-initializer/0.log" Apr 16 19:23:52.970736 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:52.970433 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" event={"ID":"1d784b08-2f86-4c75-bf0d-64efcfbac669","Type":"ContainerStarted","Data":"7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf"} Apr 16 19:23:54.068572 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:54.068541 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85"] Apr 16 19:23:54.068939 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:54.068786 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerName="storage-initializer" containerID="cri-o://7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf" gracePeriod=30 Apr 16 19:23:56.111135 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.111114 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85_1d784b08-2f86-4c75-bf0d-64efcfbac669/storage-initializer/1.log" Apr 16 19:23:56.111445 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.111434 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85_1d784b08-2f86-4c75-bf0d-64efcfbac669/storage-initializer/0.log" Apr 16 19:23:56.111505 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.111493 2566 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" Apr 16 19:23:56.253975 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.253895 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d784b08-2f86-4c75-bf0d-64efcfbac669-kserve-provision-location\") pod \"1d784b08-2f86-4c75-bf0d-64efcfbac669\" (UID: \"1d784b08-2f86-4c75-bf0d-64efcfbac669\") " Apr 16 19:23:56.254224 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.254192 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d784b08-2f86-4c75-bf0d-64efcfbac669-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1d784b08-2f86-4c75-bf0d-64efcfbac669" (UID: "1d784b08-2f86-4c75-bf0d-64efcfbac669"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:23:56.355009 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.354958 2566 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1d784b08-2f86-4c75-bf0d-64efcfbac669-kserve-provision-location\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:23:56.411365 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411332 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kvlwf/must-gather-jk9bb"] Apr 16 19:23:56.411652 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411641 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" Apr 16 19:23:56.411696 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411654 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" Apr 16 19:23:56.411696 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411664 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerName="storage-initializer" Apr 16 19:23:56.411696 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411670 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerName="storage-initializer" Apr 16 19:23:56.411696 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411678 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerName="storage-initializer" Apr 16 19:23:56.411696 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411683 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerName="storage-initializer" Apr 16 19:23:56.411847 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411699 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="storage-initializer" Apr 16 19:23:56.411847 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411705 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="storage-initializer" Apr 16 19:23:56.411847 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411752 2566 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerName="storage-initializer" Apr 16 19:23:56.411847 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411759 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5e50a35-0d90-4197-9ca0-69e34d6fb4f2" containerName="kserve-container" Apr 16 19:23:56.411847 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.411768 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerName="storage-initializer" Apr 16 19:23:56.414903 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.414888 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.417574 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.417536 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kvlwf\"/\"openshift-service-ca.crt\"" Apr 16 19:23:56.417574 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.417559 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kvlwf\"/\"kube-root-ca.crt\"" Apr 16 19:23:56.417731 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.417569 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kvlwf\"/\"default-dockercfg-p8l5s\"" Apr 16 19:23:56.425331 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.425310 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kvlwf/must-gather-jk9bb"] Apr 16 19:23:56.556757 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.556668 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dn7w\" (UniqueName: \"kubernetes.io/projected/13075abf-c0be-40ad-8d8b-3520fa523d38-kube-api-access-4dn7w\") pod \"must-gather-jk9bb\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.556757 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.556746 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13075abf-c0be-40ad-8d8b-3520fa523d38-must-gather-output\") pod \"must-gather-jk9bb\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.657459 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.657426 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4dn7w\" (UniqueName: \"kubernetes.io/projected/13075abf-c0be-40ad-8d8b-3520fa523d38-kube-api-access-4dn7w\") pod \"must-gather-jk9bb\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.657459 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.657464 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13075abf-c0be-40ad-8d8b-3520fa523d38-must-gather-output\") pod \"must-gather-jk9bb\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.657766 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.657750 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/13075abf-c0be-40ad-8d8b-3520fa523d38-must-gather-output\") pod \"must-gather-jk9bb\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.666182 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.666160 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dn7w\" (UniqueName: \"kubernetes.io/projected/13075abf-c0be-40ad-8d8b-3520fa523d38-kube-api-access-4dn7w\") pod \"must-gather-jk9bb\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.739317 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.739283 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:23:56.856157 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.856126 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kvlwf/must-gather-jk9bb"] Apr 16 19:23:56.859326 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:23:56.859297 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13075abf_c0be_40ad_8d8b_3520fa523d38.slice/crio-46d0df3c23905588a3934a777a981b6bd4ac9d8134b1f94c5d87f778c71e0085 WatchSource:0}: Error finding container 46d0df3c23905588a3934a777a981b6bd4ac9d8134b1f94c5d87f778c71e0085: Status 404 returned error can't find the container with id 46d0df3c23905588a3934a777a981b6bd4ac9d8134b1f94c5d87f778c71e0085 Apr 16 19:23:56.984788 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.984762 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85_1d784b08-2f86-4c75-bf0d-64efcfbac669/storage-initializer/1.log" Apr 16 19:23:56.985145 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.985125 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85_1d784b08-2f86-4c75-bf0d-64efcfbac669/storage-initializer/0.log" Apr 16 19:23:56.985236 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.985165 2566 generic.go:358] "Generic (PLEG): container finished" podID="1d784b08-2f86-4c75-bf0d-64efcfbac669" containerID="7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf" exitCode=1 Apr 16 19:23:56.985296 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.985238 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" Apr 16 19:23:56.985296 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.985238 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" event={"ID":"1d784b08-2f86-4c75-bf0d-64efcfbac669","Type":"ContainerDied","Data":"7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf"} Apr 16 19:23:56.985296 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.985282 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85" event={"ID":"1d784b08-2f86-4c75-bf0d-64efcfbac669","Type":"ContainerDied","Data":"82d191c9af46451b2a1d6944399c944bc308a4a1fd851c3b1a5114eba3ba00cf"} Apr 16 19:23:56.985458 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.985307 2566 scope.go:117] "RemoveContainer" containerID="7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf" Apr 16 19:23:56.986610 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.986324 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" event={"ID":"13075abf-c0be-40ad-8d8b-3520fa523d38","Type":"ContainerStarted","Data":"46d0df3c23905588a3934a777a981b6bd4ac9d8134b1f94c5d87f778c71e0085"} Apr 16 19:23:56.993027 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.993009 2566 scope.go:117] "RemoveContainer" containerID="2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258" Apr 16 19:23:56.999745 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:56.999730 2566 scope.go:117] "RemoveContainer" containerID="7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf" Apr 16 19:23:56.999988 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:23:56.999969 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf\": container with ID starting with 7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf not found: ID does not exist" containerID="7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf" Apr 16 19:23:57.000086 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:57.000011 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf"} err="failed to get container status \"7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf\": rpc error: code = NotFound desc = could not find container \"7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf\": container with ID starting with 7fea6699888de6a7eaf07071cd4cb960f60788e8d2a3408826857b3cd07fd7bf not found: ID does not exist" Apr 16 19:23:57.000086 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:57.000028 2566 scope.go:117] "RemoveContainer" containerID="2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258" Apr 16 19:23:57.000243 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:23:57.000224 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258\": container with ID starting with 2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258 not found: ID does not exist" 
containerID="2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258" Apr 16 19:23:57.000280 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:57.000248 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258"} err="failed to get container status \"2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258\": rpc error: code = NotFound desc = could not find container \"2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258\": container with ID starting with 2a9a149e71573379ab445849bc1ffdf8ed6ae284fc2a52ba4b45ffac17fcb258 not found: ID does not exist" Apr 16 19:23:57.019184 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:57.019154 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85"] Apr 16 19:23:57.021786 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:57.021764 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-s3-tls-serving-fail-predictor-57b648fdbb-n4p85"] Apr 16 19:23:58.452590 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:23:58.452561 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d784b08-2f86-4c75-bf0d-64efcfbac669" path="/var/lib/kubelet/pods/1d784b08-2f86-4c75-bf0d-64efcfbac669/volumes" Apr 16 19:24:02.013646 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:02.013607 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" event={"ID":"13075abf-c0be-40ad-8d8b-3520fa523d38","Type":"ContainerStarted","Data":"c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107"} Apr 16 19:24:02.013646 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:02.013649 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" event={"ID":"13075abf-c0be-40ad-8d8b-3520fa523d38","Type":"ContainerStarted","Data":"53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c"} Apr 16 19:24:02.041718 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:02.041656 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" podStartSLOduration=1.411186056 podStartE2EDuration="6.041636575s" podCreationTimestamp="2026-04-16 19:23:56 +0000 UTC" firstStartedPulling="2026-04-16 19:23:56.861012154 +0000 UTC m=+4012.953300096" lastFinishedPulling="2026-04-16 19:24:01.491462669 +0000 UTC m=+4017.583750615" observedRunningTime="2026-04-16 19:24:02.039470247 +0000 UTC m=+4018.131758213" watchObservedRunningTime="2026-04-16 19:24:02.041636575 +0000 UTC m=+4018.133924541" Apr 16 19:24:23.084654 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:23.084618 2566 generic.go:358] "Generic (PLEG): container finished" podID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerID="53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c" exitCode=0 Apr 16 19:24:23.085102 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:23.084692 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" event={"ID":"13075abf-c0be-40ad-8d8b-3520fa523d38","Type":"ContainerDied","Data":"53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c"} Apr 16 19:24:23.085102 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:23.085022 2566 scope.go:117] "RemoveContainer" containerID="53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c" Apr 
16 19:24:23.554365 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:23.554290 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvlwf_must-gather-jk9bb_13075abf-c0be-40ad-8d8b-3520fa523d38/gather/0.log" Apr 16 19:24:27.141753 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:27.141720 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-wtxhb_5504c48e-268b-4506-8112-5817466db907/global-pull-secret-syncer/0.log" Apr 16 19:24:27.207187 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:27.207147 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-6tlbq_1c68ba07-dba4-4de5-923b-da334bafc1fb/konnectivity-agent/0.log" Apr 16 19:24:27.322868 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:27.322833 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-136-226.ec2.internal_772bd1ceeee63d6bb44b05ffabbad76d/haproxy/0.log" Apr 16 19:24:29.093404 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.093372 2566 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-kvlwf/must-gather-jk9bb"] Apr 16 19:24:29.093808 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.093600 2566 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerName="copy" containerID="cri-o://c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107" gracePeriod=2 Apr 16 19:24:29.099660 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.099622 2566 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-kvlwf/must-gather-jk9bb"] Apr 16 19:24:29.318239 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.318220 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvlwf_must-gather-jk9bb_13075abf-c0be-40ad-8d8b-3520fa523d38/copy/0.log" Apr 16 19:24:29.318578 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.318563 2566 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:24:29.320876 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.320855 2566 status_manager.go:895] "Failed to get status for pod" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" err="pods \"must-gather-jk9bb\" is forbidden: User \"system:node:ip-10-0-136-226.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kvlwf\": no relationship found between node 'ip-10-0-136-226.ec2.internal' and this object" Apr 16 19:24:29.334212 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.334197 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dn7w\" (UniqueName: \"kubernetes.io/projected/13075abf-c0be-40ad-8d8b-3520fa523d38-kube-api-access-4dn7w\") pod \"13075abf-c0be-40ad-8d8b-3520fa523d38\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " Apr 16 19:24:29.334268 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.334245 2566 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13075abf-c0be-40ad-8d8b-3520fa523d38-must-gather-output\") pod \"13075abf-c0be-40ad-8d8b-3520fa523d38\" (UID: \"13075abf-c0be-40ad-8d8b-3520fa523d38\") " Apr 16 19:24:29.335747 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.335721 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13075abf-c0be-40ad-8d8b-3520fa523d38-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "13075abf-c0be-40ad-8d8b-3520fa523d38" (UID: "13075abf-c0be-40ad-8d8b-3520fa523d38"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:24:29.336360 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.336338 2566 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13075abf-c0be-40ad-8d8b-3520fa523d38-kube-api-access-4dn7w" (OuterVolumeSpecName: "kube-api-access-4dn7w") pod "13075abf-c0be-40ad-8d8b-3520fa523d38" (UID: "13075abf-c0be-40ad-8d8b-3520fa523d38"). InnerVolumeSpecName "kube-api-access-4dn7w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:24:29.434852 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.434761 2566 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4dn7w\" (UniqueName: \"kubernetes.io/projected/13075abf-c0be-40ad-8d8b-3520fa523d38-kube-api-access-4dn7w\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:24:29.434852 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:29.434791 2566 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/13075abf-c0be-40ad-8d8b-3520fa523d38-must-gather-output\") on node \"ip-10-0-136-226.ec2.internal\" DevicePath \"\"" Apr 16 19:24:30.106660 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.106625 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-kvlwf_must-gather-jk9bb_13075abf-c0be-40ad-8d8b-3520fa523d38/copy/0.log" Apr 16 19:24:30.107122 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.106975 2566 generic.go:358] "Generic (PLEG): container finished" podID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerID="c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107" exitCode=143 Apr 16 19:24:30.107122 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.107058 2566 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" Apr 16 19:24:30.107122 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.107078 2566 scope.go:117] "RemoveContainer" containerID="c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107" Apr 16 19:24:30.110033 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.110008 2566 status_manager.go:895] "Failed to get status for pod" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" err="pods \"must-gather-jk9bb\" is forbidden: User \"system:node:ip-10-0-136-226.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kvlwf\": no relationship found between node 'ip-10-0-136-226.ec2.internal' and this object" Apr 16 19:24:30.115428 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.115413 2566 scope.go:117] "RemoveContainer" containerID="53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c" Apr 16 19:24:30.117453 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.117429 2566 status_manager.go:895] "Failed to get status for pod" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" pod="openshift-must-gather-kvlwf/must-gather-jk9bb" err="pods \"must-gather-jk9bb\" is forbidden: User \"system:node:ip-10-0-136-226.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-kvlwf\": no relationship found between node 'ip-10-0-136-226.ec2.internal' and this object" Apr 16 19:24:30.127521 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.127500 2566 scope.go:117] "RemoveContainer" containerID="c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107" Apr 16 19:24:30.127768 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:24:30.127747 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107\": container with ID starting with c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107 not found: ID does not exist" containerID="c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107" Apr 16 
Apr 16 19:24:30.127822 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.127777 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107"} err="failed to get container status \"c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107\": rpc error: code = NotFound desc = could not find container \"c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107\": container with ID starting with c0748503ae7d07d7dd166fbdefcc9de7febe7148e9bc0e7d91649a118db95107 not found: ID does not exist"
Apr 16 19:24:30.127822 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.127795 2566 scope.go:117] "RemoveContainer" containerID="53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c"
Apr 16 19:24:30.128066 ip-10-0-136-226 kubenswrapper[2566]: E0416 19:24:30.128043 2566 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c\": container with ID starting with 53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c not found: ID does not exist" containerID="53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c"
Apr 16 19:24:30.128120 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.128076 2566 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c"} err="failed to get container status \"53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c\": rpc error: code = NotFound desc = could not find container \"53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c\": container with ID starting with 53bbd4ccaf76a39c01fd742ce603777f7f3453b04b1090ea8b8eea15e3d5678c not found: ID does not exist"
Apr 16 19:24:30.452500 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.452417 2566 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" path="/var/lib/kubelet/pods/13075abf-c0be-40ad-8d8b-3520fa523d38/volumes"
Apr 16 19:24:30.455845 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.455820 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rt4w5_4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579/kube-state-metrics/0.log"
Apr 16 19:24:30.483497 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.483473 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rt4w5_4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579/kube-rbac-proxy-main/0.log"
Apr 16 19:24:30.512148 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.512119 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-rt4w5_4d33d8d2-1b22-4a2b-bdef-ebdfa8fdc579/kube-rbac-proxy-self/0.log"
Apr 16 19:24:30.629928 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.629906 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jm7p_285a85b9-1863-44eb-9c99-d65ffce469c1/node-exporter/0.log"
Apr 16 19:24:30.660581 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.660562 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jm7p_285a85b9-1863-44eb-9c99-d65ffce469c1/kube-rbac-proxy/0.log"
Apr 16 19:24:30.689406 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.689381 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-8jm7p_285a85b9-1863-44eb-9c99-d65ffce469c1/init-textfile/0.log"
Apr 16 19:24:30.930734 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.930708 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z79hs_04b00891-8f1a-4b47-bdb5-63b1933e788f/kube-rbac-proxy-main/0.log"
Apr 16 19:24:30.965275 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.965246 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z79hs_04b00891-8f1a-4b47-bdb5-63b1933e788f/kube-rbac-proxy-self/0.log"
Apr 16 19:24:30.999676 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:30.999645 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-z79hs_04b00891-8f1a-4b47-bdb5-63b1933e788f/openshift-state-metrics/0.log"
Apr 16 19:24:31.075874 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.075832 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5726efc-b17a-44d9-9703-efbe9a70152a/prometheus/0.log"
Apr 16 19:24:31.104392 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.104365 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5726efc-b17a-44d9-9703-efbe9a70152a/config-reloader/0.log"
Apr 16 19:24:31.135471 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.135437 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5726efc-b17a-44d9-9703-efbe9a70152a/thanos-sidecar/0.log"
Apr 16 19:24:31.163355 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.163333 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5726efc-b17a-44d9-9703-efbe9a70152a/kube-rbac-proxy-web/0.log"
Apr 16 19:24:31.199174 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.199105 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5726efc-b17a-44d9-9703-efbe9a70152a/kube-rbac-proxy/0.log"
Apr 16 19:24:31.224907 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.224888 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5726efc-b17a-44d9-9703-efbe9a70152a/kube-rbac-proxy-thanos/0.log"
Apr 16 19:24:31.255958 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.255930 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_f5726efc-b17a-44d9-9703-efbe9a70152a/init-config-reloader/0.log"
Apr 16 19:24:31.298453 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.298418 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7dtds_c759e2c5-22e6-4808-a546-dd9733343b92/prometheus-operator/0.log"
Apr 16 19:24:31.321816 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.321775 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-7dtds_c759e2c5-22e6-4808-a546-dd9733343b92/kube-rbac-proxy/0.log"
Apr 16 19:24:31.356430 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.356410 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-9cb97cd87-h5zzm_23e1b17b-ffd6-4cbd-acef-a6eae31b0283/prometheus-operator-admission-webhook/0.log"
Apr 16 19:24:31.392665 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.392636 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f476949-898w5_d4f8ae75-cc0c-49bd-9485-0229eb626e51/telemeter-client/0.log"
Apr 16 19:24:31.424415 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.424395 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f476949-898w5_d4f8ae75-cc0c-49bd-9485-0229eb626e51/reload/0.log"
Apr 16 19:24:31.451981 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.451918 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6f476949-898w5_d4f8ae75-cc0c-49bd-9485-0229eb626e51/kube-rbac-proxy/0.log"
Apr 16 19:24:31.493832 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.493804 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67d4b96bc7-zhxww_20d27a74-6f2b-4530-8101-ef28c10a70a7/thanos-query/0.log"
Apr 16 19:24:31.534030 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.533981 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67d4b96bc7-zhxww_20d27a74-6f2b-4530-8101-ef28c10a70a7/kube-rbac-proxy-web/0.log"
Apr 16 19:24:31.577211 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.577183 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67d4b96bc7-zhxww_20d27a74-6f2b-4530-8101-ef28c10a70a7/kube-rbac-proxy/0.log"
Apr 16 19:24:31.614267 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.614247 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67d4b96bc7-zhxww_20d27a74-6f2b-4530-8101-ef28c10a70a7/prom-label-proxy/0.log"
Apr 16 19:24:31.667914 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.667890 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67d4b96bc7-zhxww_20d27a74-6f2b-4530-8101-ef28c10a70a7/kube-rbac-proxy-rules/0.log"
Apr 16 19:24:31.716628 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:31.716554 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-67d4b96bc7-zhxww_20d27a74-6f2b-4530-8101-ef28c10a70a7/kube-rbac-proxy-metrics/0.log"
Apr 16 19:24:32.875594 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:32.875565 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-r2ztm_c7de8316-4440-4039-ae31-310a6c1146a9/networking-console-plugin/0.log"
Apr 16 19:24:34.452324 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.452284 2566 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"]
Apr 16 19:24:34.452785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.452732 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerName="copy"
Apr 16 19:24:34.452785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.452749 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerName="copy"
Apr 16 19:24:34.452785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.452762 2566 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerName="gather"
Apr 16 19:24:34.452785 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.452770 2566 state_mem.go:107] "Deleted CPUSet assignment" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerName="gather"
Apr 16 19:24:34.453031 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.452845 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerName="gather"
Apr 16 19:24:34.453031 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.452862 2566 memory_manager.go:356] "RemoveStaleState removing state" podUID="13075abf-c0be-40ad-8d8b-3520fa523d38" containerName="copy"
Apr 16 19:24:34.457613 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.457589 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.461048 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.461028 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7m7f8\"/\"openshift-service-ca.crt\""
Apr 16 19:24:34.461169 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.461046 2566 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7m7f8\"/\"kube-root-ca.crt\""
Apr 16 19:24:34.462171 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.462155 2566 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7m7f8\"/\"default-dockercfg-jlsz5\""
Apr 16 19:24:34.465265 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.465244 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"]
Apr 16 19:24:34.575370 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.575335 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-podres\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.575370 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.575372 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-lib-modules\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.575577 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.575406 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-sys\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.575577 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.575481 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbxf\" (UniqueName: \"kubernetes.io/projected/9400be88-0df4-4097-a55d-d3be3ccbcd6b-kube-api-access-snbxf\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.575577 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.575524 2566 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-proc\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.676040 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.675986 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-podres\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.676040 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.676044 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-lib-modules\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.676278 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.676085 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-sys\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.676278 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.676135 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-snbxf\" (UniqueName: \"kubernetes.io/projected/9400be88-0df4-4097-a55d-d3be3ccbcd6b-kube-api-access-snbxf\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.676278 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.676138 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-podres\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.676278 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.676175 2566 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-proc\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
Apr 16 19:24:34.676278 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.676182 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-sys\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"
\"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" Apr 16 19:24:34.676278 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.676240 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9400be88-0df4-4097-a55d-d3be3ccbcd6b-proc\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" Apr 16 19:24:34.686226 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.686198 2566 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbxf\" (UniqueName: \"kubernetes.io/projected/9400be88-0df4-4097-a55d-d3be3ccbcd6b-kube-api-access-snbxf\") pod \"perf-node-gather-daemonset-bx4bx\" (UID: \"9400be88-0df4-4097-a55d-d3be3ccbcd6b\") " pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" Apr 16 19:24:34.768364 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.768266 2566 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" Apr 16 19:24:34.898754 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.898730 2566 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx"] Apr 16 19:24:34.901584 ip-10-0-136-226 kubenswrapper[2566]: W0416 19:24:34.901552 2566 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9400be88_0df4_4097_a55d_d3be3ccbcd6b.slice/crio-61b7374453b49566b55b31d82f70b61c7aed75761f3212a9b4a7277103a28d35 WatchSource:0}: Error finding container 61b7374453b49566b55b31d82f70b61c7aed75761f3212a9b4a7277103a28d35: Status 404 returned error can't find the container with id 61b7374453b49566b55b31d82f70b61c7aed75761f3212a9b4a7277103a28d35 Apr 16 19:24:34.903456 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:34.903441 2566 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 19:24:35.028974 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.028896 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2zc2n_4a945a62-4bc4-4f09-8555-50569018d9ac/dns/0.log" Apr 16 19:24:35.060864 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.060831 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2zc2n_4a945a62-4bc4-4f09-8555-50569018d9ac/kube-rbac-proxy/0.log" Apr 16 19:24:35.125745 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.125698 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" event={"ID":"9400be88-0df4-4097-a55d-d3be3ccbcd6b","Type":"ContainerStarted","Data":"3f50839dfba54fb4ed79de286a54a38907c93b73d7865f9127e79faf7d50758f"} Apr 16 19:24:35.125745 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.125740 2566 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" event={"ID":"9400be88-0df4-4097-a55d-d3be3ccbcd6b","Type":"ContainerStarted","Data":"61b7374453b49566b55b31d82f70b61c7aed75761f3212a9b4a7277103a28d35"} Apr 16 19:24:35.126007 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.125759 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" Apr 16 
19:24:35.145296 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.145249 2566 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" podStartSLOduration=1.145234102 podStartE2EDuration="1.145234102s" podCreationTimestamp="2026-04-16 19:24:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:24:35.143532938 +0000 UTC m=+4051.235820902" watchObservedRunningTime="2026-04-16 19:24:35.145234102 +0000 UTC m=+4051.237522458" Apr 16 19:24:35.210448 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.210422 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-5psgq_125ba5ab-da90-4b8d-b93b-56e647e63aff/dns-node-resolver/0.log" Apr 16 19:24:35.796822 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:35.796788 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-hp46j_b06d5b5e-fa9c-4211-acc1-3b2c5f851673/node-ca/0.log" Apr 16 19:24:37.137744 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:37.137716 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cpbgb_6b838b63-87c9-46b4-96a1-ed246b230c36/serve-healthcheck-canary/0.log" Apr 16 19:24:37.598926 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:37.598899 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dz7b9_2021cc59-cdbb-4dd8-a90e-f8f2f331b558/kube-rbac-proxy/0.log" Apr 16 19:24:37.629930 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:37.629903 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dz7b9_2021cc59-cdbb-4dd8-a90e-f8f2f331b558/exporter/0.log" Apr 16 19:24:37.658893 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:37.658860 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dz7b9_2021cc59-cdbb-4dd8-a90e-f8f2f331b558/extractor/0.log" Apr 16 19:24:40.023788 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:40.023753 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-tmms8_08efaeba-fdcb-44d5-bec2-39a299e6eb3d/manager/0.log" Apr 16 19:24:40.425483 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:40.425453 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-pmh5s_f661a39a-8433-4d1d-9c24-aea36bb0c831/manager/0.log" Apr 16 19:24:40.494866 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:40.494837 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-hdzgn_5d51deaf-b55f-4111-afb9-e258cd821a00/s3-init/0.log" Apr 16 19:24:40.549266 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:40.549243 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-custom-44qth_fd984710-2bda-435c-862c-c3bbe07c161f/s3-tls-init-custom/0.log" Apr 16 19:24:40.580489 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:40.580462 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-tls-init-serving-vlrxn_527f11d7-de15-47e3-a52b-b37ec6de8cca/s3-tls-init-serving/0.log" Apr 16 19:24:40.617052 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:40.617026 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-sxf6f_95a3dbb4-48a9-4b3e-9073-dd14bb5891f7/seaweedfs/0.log" Apr 16 
19:24:41.138184 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:41.138160 2566 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7m7f8/perf-node-gather-daemonset-bx4bx" Apr 16 19:24:47.039722 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.039692 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl2g2_b9fbc6e2-6448-4213-ac02-c0df39de143e/kube-multus-additional-cni-plugins/0.log" Apr 16 19:24:47.073227 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.073204 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl2g2_b9fbc6e2-6448-4213-ac02-c0df39de143e/egress-router-binary-copy/0.log" Apr 16 19:24:47.103819 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.103792 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl2g2_b9fbc6e2-6448-4213-ac02-c0df39de143e/cni-plugins/0.log" Apr 16 19:24:47.129728 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.129702 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl2g2_b9fbc6e2-6448-4213-ac02-c0df39de143e/bond-cni-plugin/0.log" Apr 16 19:24:47.166050 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.166024 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl2g2_b9fbc6e2-6448-4213-ac02-c0df39de143e/routeoverride-cni/0.log" Apr 16 19:24:47.194800 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.194780 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl2g2_b9fbc6e2-6448-4213-ac02-c0df39de143e/whereabouts-cni-bincopy/0.log" Apr 16 19:24:47.224082 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.224059 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-fl2g2_b9fbc6e2-6448-4213-ac02-c0df39de143e/whereabouts-cni/0.log" Apr 16 19:24:47.493904 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.493845 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-gs6xw_25aae314-4a74-4705-b118-50fda5694b79/kube-multus/0.log" Apr 16 19:24:47.614115 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.614089 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jj9db_bd001d43-c6f4-44f4-906e-c01f02068004/network-metrics-daemon/0.log" Apr 16 19:24:47.635782 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:47.635757 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jj9db_bd001d43-c6f4-44f4-906e-c01f02068004/kube-rbac-proxy/0.log" Apr 16 19:24:49.290613 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.290581 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/ovn-controller/0.log" Apr 16 19:24:49.383276 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.383241 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/ovn-acl-logging/0.log" Apr 16 19:24:49.448944 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.448916 2566 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/kube-rbac-proxy-node/0.log" Apr 16 19:24:49.479092 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.479061 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 19:24:49.502957 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.502934 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/northd/0.log" Apr 16 19:24:49.531727 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.531701 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/nbdb/0.log" Apr 16 19:24:49.564884 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.564813 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/sbdb/0.log" Apr 16 19:24:49.754783 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:49.754749 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kxwr5_dca169f9-fe56-4084-aff9-5a447ae82401/ovnkube-controller/0.log" Apr 16 19:24:50.895035 ip-10-0-136-226 kubenswrapper[2566]: I0416 19:24:50.895008 2566 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-52dbn_3dd548ef-63ff-4ea7-825d-0fa73a6487db/network-check-target-container/0.log"